See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/421/display/redirect?page=changes>
Changes:

[relax] Ensure timer consistency in Dataflow and portable runners

------------------------------------------
[...truncated 63.54 KB...]
> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task :sdks:java:expansion-service:compileJava
Note: <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/java/expansion-service/src/main/java/org/apache/beam/sdk/expansion/service/ExpansionService.java> uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar

> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar

> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar

> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.33.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210720
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210720" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920081.783597/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-08-02T16:01:27.848833Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-08-02_09_01_27-17118644682173805589'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0802150506'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-08-02T16:01:27.848833Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-08-02_09_01_27-17118644682173805589]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-08-02_09_01_27-17118644682173805589
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_09_01_27-17118644682173805589?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-08-02_09_01_27-17118644682173805589 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:27.042Z: JOB_MESSAGE_BASIC: Streaming Engine auto-enabled. Use --experiments=disable_streaming_engine to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:27.043Z: JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled. Use --experiments=disable_runner_v2 to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:34.330Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.005Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.032Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.097Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.129Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create input/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.164Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.202Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.244Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.319Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.352Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.388Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Split into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.417Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/AddRandomKeys into Create input/Split
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.451Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create input/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.483Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create input/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.514Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create input/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.552Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create input/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.593Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/RemoveRandomKeys into Create input/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.627Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/ReadSplits into Create input/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.662Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into Create input/ReadSplits
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.769Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write/NativeWrite into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.816Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.849Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.871Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.901Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.961Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:35.997Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:36.021Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:01:56.056Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:02:14.556Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:02:43.161Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:02:43.187Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-08-02_09_01_27-17118644682173805589 after 603 seconds
WARNING:apache_beam.transforms.core:GroupByKey: Unsafe trigger type (DataLossReason.CONDITION_NOT_GUARANTEED) detected. Starting with Beam 2.33, this will raise an error by default. Either change the pipeline to use a safe trigger or set the --allow_unsafe_triggers flag.
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
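
The GroupByKey warning above names two remedies: switch to a trigger Beam considers safe, or pass the --allow_unsafe_triggers flag. As a minimal sketch of the flag-based remedy, the flag is a real Beam pipeline option, but the Create/GroupByKey pipeline below is a placeholder standing in for the perf test's actual graph:

    # Sketch: opting in to an unsafe trigger so Beam 2.33+ does not fail
    # pipeline construction. Only the flag comes from the warning above;
    # the pipeline itself is illustrative.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--allow_unsafe_triggers'])

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create([('key', 1), ('key', 2)])
         | beam.GroupByKey()   # the transform the warning is about
         | beam.Map(print))
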
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.33.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210720
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210720" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0802150506.1627920852.128844/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-08-02T16:14:18.453125Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-08-02_09_14_17-15154953699949253447'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0802150506'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-08-02T16:14:18.453125Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-08-02_09_14_17-15154953699949253447]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-08-02_09_14_17-15154953699949253447
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_09_14_17-15154953699949253447?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-08-02_09_14_17-15154953699949253447 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:17.515Z: JOB_MESSAGE_BASIC: Streaming Engine auto-enabled. Use --experiments=disable_streaming_engine to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:17.516Z: JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled. Use --experiments=disable_runner_v2 to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:23.845Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:24.586Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:24.647Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:24.708Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:24.769Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:24.797Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:24.856Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:24.940Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
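
The "Discarding unparseable args" warnings above are benign: --pubsub_namespace_prefix is a test-harness argument rather than a registered Beam pipeline option, so PipelineOptions drops it during parsing. A sketch of how such an option would be declared so it parses cleanly; the class name here is hypothetical, only the _add_argparse_args pattern is the Beam SDK's:

    # Sketch: registering a custom option with PipelineOptions so it is
    # parsed instead of discarded as unparseable.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfTestOptions(PipelineOptions):  # hypothetical name
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default='pubsub_io_performance_',
                help='Prefix for temporary Pub/Sub topics/subscriptions.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubPerfTestOptions).pubsub_namespace_prefix)
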
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.027Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.056Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.086Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.138Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.176Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.205Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.241Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.305Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.341Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.370Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.403Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.440Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write/NativeWrite into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.480Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.511Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.542Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.570Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.650Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.690Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:25.722Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:14:37.994Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:15:09.704Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:15:41.376Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-08-02T16:15:41.412Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-08-02_09_14_17-15154953699949253447 after 602 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_edcac95a-81bc-405c-9eb6-8a4dadfcb25d_read_matcher.
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 565, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 69, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_09_01_27-17118644682173805589?project=apache-beam-testing
    raise AssertionError(description)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_09_14_17-15154953699949253447?project=apache-beam-testing
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
Expected but not in actual: dict_items([(b'2097152', 1)])
Unexpected: dict_items([])

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 59s

83 actionable tasks: 70 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/skpctj6iyjbws

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
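
For triage: the ERROR line names the exact subscription the matcher polled, and the assertion shows the single expected payload (b'2097152') never arrived. While the test's temporary resources still exist, a direct pull can confirm whether anything reached the subscription at all. This is a hypothetical diagnostic sketch using the google-cloud-pubsub client, not part of the test suite:

    # Sketch: pull straight from the matcher's subscription (IDs copied
    # from the ERROR line above; the subscription may already be deleted
    # by test teardown).
    from google.cloud import pubsub_v1

    subscription = (
        'projects/apache-beam-testing/subscriptions/'
        'pubsub_io_performance_edcac95a-81bc-405c-9eb6-8a4dadfcb25d_read_matcher')

    client = pubsub_v1.SubscriberClient()
    response = client.pull(subscription=subscription, max_messages=10, timeout=30)
    if not response.received_messages:
        print('No messages available, matching the 0-of-1 result above.')
    for received in response.received_messages:
        print(received.message.data)  # expected payload: b'2097152'
        client.acknowledge(subscription=subscription, ack_ids=[received.ack_id])
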
