See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/646/display/redirect?page=changes>

Changes:

[noreply] Mapped JOB_STATE_RESOURCE_CLEANING_UP to State.RUNNING.
[ryanthompson591] fixed typo in typehints
[zyichi] Remove unused prebuild_sdk_container_base_iamge option from validate
[hengfeng] feat: add more custom metrics
[noreply] [BEAM-14103][Playgrounf][Bugfix] Fix google analytics id (#17092)
[noreply] Minor: Make ScopedReadStateSupplier final (#16992)
[noreply] [BEAM-14113] Improve SamzaJobServerDriver extensibility (#17099)
[noreply] [BEAM-14116] Chunk commit requests dynamically (#17004)
[noreply] Merge pull request #17079 from [BEAM-13660] Add types and queries in
[noreply] [BEAM-13888] Add unit testing to ioutilx (#17058)
[noreply] Merge pull request #16822 from [BEAM-13841][Playground] Add Application
[noreply] Minor: Make serializableCoder warning gramatically correct english
[noreply] [BEAM-14091] Fixing Interactive Beam show/collect for remote runners
[noreply] [BEAM-11934] Add enable_file_dynamic_sharding to allow DataflowRunner
[noreply] [BEAM-12777] Create symlink for `current` directory (#17105)
[noreply] [BEAM-14020] Adding SchemaTransform, SchemaTransformProvider,
[noreply] [BEAM-13015] Modify metrics to begin and reset to a non-dirty state.

------------------------------------------
[...truncated 57.07 KB...]
> Task :sdks:java:harness:jar
> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar
> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job clientRequestId: '20220317160300078338-9170' createTime: '2022-03-17T16:03:06.851624Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2022-03-17_09_03_06-16520541463599878889' location: 'us-central1' name: 'performance-tests-psio-python-2gb0317150459' projectId: 'apache-beam-testing' stageStates: [] startTime: '2022-03-17T16:03:06.851624Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-17_09_03_06-16520541463599878889]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-17_09_03_06-16520541463599878889
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_03_06-16520541463599878889?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-17_09_03_06-16520541463599878889 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:31.508Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.489Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.700Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.960Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.164Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.197Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.232Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.281Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.334Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.366Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.399Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.431Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.514Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.702Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.764Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.871Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.018Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.310Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.620Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.762Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.796Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:55.849Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:05.528Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:05.586Z: JOB_MESSAGE_DETAILED: Resized **** pool to 3, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:16.013Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:39.379Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:39.409Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-17_09_03_06-16520541463599878889 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 799b919e335e4da5b604bf26cf2e95f7 and timestamp: 1647533794.896252:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 118
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job clientRequestId: '20220317161639862367-7675' createTime: '2022-03-17T16:16:45.948426Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2022-03-17_09_16_45-14787157976749017395' location: 'us-central1' name: 'performance-tests-psio-python-2gb0317150459' projectId: 'apache-beam-testing' stageStates: [] startTime: '2022-03-17T16:16:45.948426Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-17_09_16_45-14787157976749017395]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-17_09_16_45-14787157976749017395
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_16_45-14787157976749017395?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-17_09_16_45-14787157976749017395 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:16:54.906Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:01.982Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.012Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.088Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.247Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.319Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.473Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.610Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.934Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.060Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.144Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.211Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.336Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.391Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.467Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.525Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.589Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.668Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.725Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.769Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.810Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.852Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.875Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.913Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.949Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.998Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.020Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.063Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.782Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:24.719Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:24.756Z: JOB_MESSAGE_DETAILED: Resized **** pool to 3, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:45.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:45.663Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:56.248Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:59.060Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:59.094Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-17_09_16_45-14787157976749017395 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f3c179ef93754940a4bcf4371370c909 and timestamp: 1647534722.8106575:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f3c179ef93754940a4bcf4371370c909 and timestamp: 1647534722.8106575:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_49e91cf1-c4de-45b6-a055-d0a8ceb6c521_read'

Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_03_06-16520541463599878889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_16_45-14787157976749017395?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 39s

92 actionable tasks: 73 executed, 17 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ei2hbo4fxnbac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
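A note on the TypeError in the traceback above: generated Google Cloud clients built on proto-plus accept either a request object/dict or keyword-only fields, so passing the subscription path as a bare positional string (as `self.sub_client.delete_subscription(self.read_sub_name)` does in `pubsub_io_perf_test.py`) is rejected by the request-message constructor. The sketch below is a minimal, self-contained reproduction of that calling convention; the classes and function here are illustrative stand-ins, not the real `google.pubsub_v1` API, and the suggested fix (`delete_subscription(subscription=...)`) is an assumption based on the error, not a confirmed patch.

```python
# Illustrative stand-ins for the proto-plus request message and the
# generated client method (NOT the real google.pubsub_v1 code).

class DeleteSubscriptionRequest:
    """Accepts a dict/message mapping or keyword fields -- never a bare string."""

    def __init__(self, mapping=None, *, subscription=None):
        if mapping is not None:
            if isinstance(mapping, dict):
                subscription = mapping.get("subscription")
            elif isinstance(mapping, DeleteSubscriptionRequest):
                subscription = mapping.subscription
            else:
                # Mirrors the error seen in the log above.
                raise TypeError(
                    "Invalid constructor input for %s: %r"
                    % (type(self).__name__, mapping))
        self.subscription = subscription


def delete_subscription(request=None, *, subscription=None):
    """Generated-client style: positional input is coerced into the request type."""
    if request is not None and subscription is not None:
        raise ValueError("Pass either 'request' or 'subscription', not both.")
    if subscription is not None:
        request = DeleteSubscriptionRequest(subscription=subscription)
    elif not isinstance(request, DeleteSubscriptionRequest):
        # A plain string lands here and raises TypeError, as in the build.
        request = DeleteSubscriptionRequest(request)
    return request.subscription  # stand-in for issuing the RPC


path = "projects/apache-beam-testing/subscriptions/example_read"

# Old-style positional call, as in cleanup(): rejected.
try:
    delete_subscription(path)
except TypeError as e:
    print("rejected:", e)

# Keyword form accepted by the newer client surface.
print("deleted:", delete_subscription(subscription=path))
```

Under this convention, a dict request such as `delete_subscription(request={"subscription": path})` would also be accepted.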
