See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/364/display/redirect?page=changes>
Changes:

[zyichi] Minor fix to prebuilding sdk workflow timeout setting
[Ismaël Mejía] [BEAM-12281] Drop support for Flink 1.10
[Ismaël Mejía] [BEAM-12281] Update Flink Jenkins jobs to use Flink 1.12
[Brian Hulette] Add DataFrame API updates to CHANGES.md
[anant.damle] [BEAM-12427] Ignore the AutoValue_* classes generated in "generated"
[Ismaël Mejía] [BEAM-12423] Upgrade pyarrow to support version 4.0.0 too
[zyichi] [BEAM-12437] Fix broken test from missing allow_unsafe_triggers
[noreply] [BEAM-8787] Contribution Guide Improvement (#14477)
[noreply] [BEAM-12417] Add UnknownLogicalType to Java for passing through unknown
[noreply] Minor: Add problematic path to error message (#14903)

------------------------------------------
[...truncated 57.80 KB...]
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-construction-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:core-construction-java:classes
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:core-java:classes
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
> Task :sdks:java:extensions:google-cloud-platform-core:classes
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava
Note: <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/java/expansion-service/src/main/java/org/apache/beam/sdk/expansion/service/ExpansionService.java> uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.31.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210526
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210526" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/dataflow-****.jar in 10 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0602150459.1622649622.744554/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-06-02T16:00:34.869584Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-06-02_09_00_33-6658103690957254039'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0602150459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-06-02T16:00:34.869584Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-06-02_09_00_33-6658103690957254039]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-06-02_09_00_33-6658103690957254039
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-02_09_00_33-6658103690957254039?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-06-02_09_00_33-6658103690957254039 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:33.493Z: JOB_MESSAGE_BASIC: Streaming Engine auto-enabled. Use --experiments=disable_streaming_engine to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:33.493Z: JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled. Use --experiments=disable_runner_v2 to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:40.286Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.270Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
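An aside on the "Discarding unparseable args" warnings above: Beam's Python pipeline-option parsing separates flags it recognizes from leftover test-harness flags such as --pubsub_namespace_prefix, warning about (rather than rejecting) the leftovers. A minimal stand-alone sketch of that behavior using plain argparse — this is an illustration of the general pattern, not Beam's actual PipelineOptions code, and the --project option here is just a hypothetical "known" flag:

```python
import argparse

# Hypothetical parser standing in for the pipeline-option machinery;
# only --project is a "known" option in this sketch.
parser = argparse.ArgumentParser()
parser.add_argument('--project')

# parse_known_args() consumes the flags it knows and hands back the rest,
# instead of raising on the first unknown flag.
known, unknown = parser.parse_known_args(
    ['--project=apache-beam-testing',
     '--pubsub_namespace_prefix=pubsub_io_performance_'])

print(unknown)  # -> ['--pubsub_namespace_prefix=pubsub_io_performance_']
```

A caller can then log a warning listing `unknown` and drop those flags, which is exactly the shape of the two WARNING lines in the log.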
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.334Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.466Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.524Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create input/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.694Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.753Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.854Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.955Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:41.988Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.015Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Split into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.048Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/AddRandomKeys into Create input/Split
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.081Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create input/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.145Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create input/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.168Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create input/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.189Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create input/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.215Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Reshuffle/RemoveRandomKeys into Create input/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.236Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/ReadSplits into Create input/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.259Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into Create input/ReadSplits
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.281Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.349Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.415Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write/NativeWrite into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.477Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.506Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.541Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.575Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.662Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.688Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:00:42.749Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:01:18.225Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors.
See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:01:25.075Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:01:59.387Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:01:59.420Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:10:46.876Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:10:47.778Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:10:47.842Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:10:47.911Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-02T16:10:47.948Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-06-02_09_00_33-6658103690957254039 after 605 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7f5ceba12761499e8e086969c4ac010d and timestamp: 1622650296.2215216:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 114
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-02_09_00_33-6658103690957254039?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 147, in run
    self.test()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 183, in test
    | 'Write to Pubsub' >> beam.io.WriteToPubSub(self.matcher_topic_name))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pvalue.py>", line 136, in __or__
    return self.pipeline.apply(ptransform, self)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 641, in apply
    transform.transform, pvalueish, label or transform.label)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 651, in apply
    return self.apply(transform, pvalueish)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 694, in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 141, in apply
    return super(DataflowRunner, self).apply(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 185, in apply
    return m(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 215, in apply_PTransform
    return transform.expand(input)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1913, in expand
    | 'UnKey' >> Map(lambda k_v: k_v[1]))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pvalue.py>", line 136, in __or__
    return self.pipeline.apply(ptransform, self)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 641, in apply
    transform.transform, pvalueish, label or transform.label)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 651, in apply
    return self.apply(transform, pvalueish)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 694, in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 141, in apply
    return super(DataflowRunner, self).apply(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 185, in apply
    return m(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 215, in apply_PTransform
    return transform.expand(input)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/transforms/core.py>", line 2052, in expand
    return pcoll | GroupByKey() | 'Combine' >> CombineValues(
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pvalue.py>", line 136, in __or__
    return self.pipeline.apply(ptransform, self)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 694, in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 141, in apply
    return super(DataflowRunner, self).apply(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/runner.py>", line 185, in apply
    return m(transform, input, options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 820, in apply_GroupByKey
    return transform.expand(pcoll)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/transforms/core.py>", line 2335, in expand
    raise ValueError(msg)
ValueError: Unsafe trigger: `Repeatedly(AfterCount(2097152))` may lose data. Reason: CONDITION_NOT_GUARANTEED. This can be overriden with the --allow_unsafe_triggers flag.

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 18s
80 actionable tasks: 67 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/wutdoighegwsu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
