See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/369/display/redirect?page=changes>
Changes:

[noreply] Update tensorflow to the latest version
[noreply] Update grpcio
[Brian Hulette] Use functools.wraps in @progress_indicator
[noreply] Add license info for keras-nightly package.
[anant.damle] [BEAM-12460] Provide a simpler interface to convert Beam Row to
[Udi Meiri] [BEAM-12465] Fix nested subscripted Generics
[odidev] Add linux aarch64 wheel build support
[noreply] Merge pull request #14949: [BEAM-12356] Cache and shutdown BigQuery
[noreply] [BEAM-12379] Verify proxies in frames_test.py, and address some proxy
[noreply] Minor: Update link to wordcount pipeline, link to all examples (#14973)
[noreply] Add a blog post on how to perform release validations (#13724)
[noreply] Merge pull request #11296 from [BEAM-9640] Sketching watermark tracking
[Udi Meiri] [BEAM-12469] Fix _unified_repr to not expect __name__ to exist.
[heejong] Publish blog article for 2.30.0 release
[heejong] Publish 2.30.0 release on Beam website
[zyichi] [BEAM-12470] Increase input size of ReshuffleTest.testAssignShardFn to
[noreply] [BEAM-8137] Add Main method to ExternalWorkerService (#14942)
[pascal.gillet] [BEAM-12471] Fixes NumberFormatException

------------------------------------------
[...truncated 58.49 KB...]
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-construction-java:classes
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-java:classes
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:extensions:google-cloud-platform-core:classes
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava
Note: <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/java/expansion-service/src/main/java/org/apache/beam/sdk/expansion/service/ExpansionService.java> uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.32.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210526
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210526" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0610150633.1623340875.101927/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-06-10T16:01:24.153328Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-06-10_09_01_23-8587661447571009518'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0610150633'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-06-10T16:01:24.153328Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-06-10_09_01_23-8587661447571009518]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-06-10_09_01_23-8587661447571009518
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-10_09_01_23-8587661447571009518?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-06-10_09_01_23-8587661447571009518 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:23.121Z: JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled. Use --experiments=disable_runner_v2 to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:23.121Z: JOB_MESSAGE_BASIC: Streaming Engine auto-enabled. Use --experiments=disable_streaming_engine to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:28.547Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.439Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.525Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.672Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.717Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create input/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.760Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.792Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.895Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.952Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:29.998Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.035Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Split into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.076Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/AddRandomKeys into Create input/Split
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.118Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create input/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create input/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.200Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create input/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.241Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create input/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.335Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/RemoveRandomKeys into Create input/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.420Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/ReadSplits into Create input/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.474Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into Create input/ReadSplits
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.588Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.641Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.695Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write/NativeWrite into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.811Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.879Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:30.992Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:31.044Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:31.126Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:31.169Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:31.198Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:01:43.566Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:02:15.263Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:02:15.296Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:02:50.405Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:02:50.445Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:04:08.384Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:11:33.566Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:11:33.713Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:11:33.759Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:11:33.794Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-10T16:11:33.828Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-06-10_09_01_23-8587661447571009518 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 10e5f90ef7054db5b02f6cf4cfd870ac and timestamp: 1623341560.7133982:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 173
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-10_09_01_23-8587661447571009518?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 147, in run
    self.test()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 183, in test
    | 'Write to Pubsub' >> beam.io.WriteToPubSub(self.matcher_topic_name))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pvalue.py>", line 136, in __or__
    return self.pipeline.apply(ptransform, self)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 641, in apply
    transform.transform, pvalueish, label or transform.label)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 651, in apply
    return self.apply(transform, pvalueish)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 694, in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 141, in apply
    return super(DataflowRunner, self).apply(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 185, in apply
    return m(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 215, in apply_PTransform
    return transform.expand(input)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1913, in expand
    | 'UnKey' >> Map(lambda k_v: k_v[1]))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pvalue.py>", line 136, in __or__
    return self.pipeline.apply(ptransform, self)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 641, in apply
    transform.transform, pvalueish, label or transform.label)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 651, in apply
    return self.apply(transform, pvalueish)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 694, in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 141, in apply
    return super(DataflowRunner, self).apply(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 185, in apply
    return m(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 215, in apply_PTransform
    return transform.expand(input)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/transforms/core.py>", line 2052, in expand
    return pcoll | GroupByKey() | 'Combine' >> CombineValues(
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pvalue.py>", line 136, in __or__
    return self.pipeline.apply(ptransform, self)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 694, in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 141, in apply
    return super(DataflowRunner, self).apply(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 185, in apply
    return m(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 820, in apply_GroupByKey
    return transform.expand(pcoll)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/transforms/core.py>", line 2335, in expand
    raise ValueError(msg)
ValueError: Unsafe trigger: `Repeatedly(AfterCount(2097152))` may lose data. Reason: CONDITION_NOT_GUARANTEED. This can be overriden with the --allow_unsafe_triggers flag.

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 23s

80 actionable tasks: 67 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/nkirlp6h4js4a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
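The ValueError in the traceback above is Beam's trigger-safety check: a count trigger such as `AfterCount(2097152)` only fires once the count is reached, so a trailing batch smaller than the count may never be emitted, which is what the CONDITION_NOT_GUARANTEED reason flags. A minimal plain-Python sketch of that failure mode; `fire_after_count` is a hypothetical stand-in for illustration, not Beam's actual trigger machinery:

```python
# Hypothetical sketch of why Repeatedly(AfterCount(n)) can lose data.
# This is NOT Beam's implementation; it only models the firing condition.

def fire_after_count(elements, n):
    """Emit elements in batches of n. A trailing batch smaller than n
    never satisfies the count condition, so it is never emitted: the
    same risk Beam reports as CONDITION_NOT_GUARANTEED."""
    emitted, buffer = [], []
    for element in elements:
        buffer.append(element)
        if len(buffer) == n:  # AfterCount(n) condition satisfied: fire
            emitted.append(list(buffer))
            buffer.clear()
    return emitted, buffer  # `buffer` holds the data that would be lost

batches, lost = fire_after_count(range(7), 3)
# batches: [[0, 1, 2], [3, 4, 5]], lost: [6]
```

As the error message notes, rerunning the pipeline with the --allow_unsafe_triggers flag suppresses this check, at the cost of accepting the potential data loss.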
