See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/561/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-13088] Add parameter tempWriteDataset to BigQueryIO to store temp

[stranniknm] [BEAM-13466]: sort categories and examples by name

[aydar.zaynutdinov] [BEAM-13485][Playground]

[alexander.zhuravlev] [BEAM-13476] Changed Timeout Error Text

[alexander.zhuravlev] [BEAM-13474] Changed 'Playground' logo text color

[mmack] [BEAM-13443] Avoid blocking put to Kinesis records queue to shutdown

[noreply] Merge pull request #16278 from [BEAM-13479] [Playground] Change logic

[noreply] Merge pull request #16281 from [BEAM-13475] [Playground] [Bugfix] Error

[noreply] Merge pull request #16279 from [BEAM-13473] [Playground] [Bugfix] Reset

[noreply] Merge pull request #16232 from [BEAM-13285][Playground] Add steps to

[noreply] [BEAM-13399] Add check for dev versions of JARs to download code

[noreply] Merge pull request #16122 from [BEAM-13345] [Playground] add resizable

[benjamin.gonzalez] [BEAM-13088] Make tempDataset final

[noreply] Merge pull request #16170 from [BEAM-13411][Playground] Add getting of

[noreply] Merge pull request #16172 from [BEAM-13417] [Playground] Add java

[noreply] Merge pull request #16240 from [BEAM-13465][Playground] Change error

[noreply] Merge pull request #16234 from [BEAM-13461][Playground] [Bugfix] Error

[noreply] Merge pull request #15489 from [BEAM-12865] Allow custom batch_duration

[noreply] [BEAM-13483] Increase timeout of Java Examples Dataflow suite. (#16226)

[Kyle Weaver] [BEAM-13496] Upgrade Flink runner to include log4j patches.

[Kyle Weaver] [BEAM-13497] Correct class name in Flink tests.

[noreply] Pin transitive log4j dependencies to 2.17.0 in :sdks:java:io:hcatalog

[noreply] [BEAM-13434] Bump solr to 8.11.1 (#16296)

[noreply] Use a patched shadow 6.1.0 plugin using Log4j 2.16.0 (#16269)

[noreply] [BEAM-12830] Replace GoGradle plugin with Shell Scripts. (#16291)

[stranniknm] [BEAM-13502]: fix loading example on embedded version

[noreply] [BEAM-13430] Start the process of upgrading the Gradle 7. (#16292)


------------------------------------------
[...truncated 53.68 KB...]
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task :sdks:java:expansion-service:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar

> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar

> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar

> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.36.0.dev
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211213
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211213" for Docker 
environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102077.869527/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
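
For reference, the "Discarding unparseable args" warnings above come from Beam's pipeline-options parsing: flags that no registered PipelineOptions subclass declares are dropped with exactly this warning. A minimal sketch of the behavior (illustrative, not the test's code):

    from apache_beam.options.pipeline_options import PipelineOptions

    # '--pubsub_namespace_prefix' is a harness-level flag, not a registered
    # pipeline option, so parsing warns "Discarding unparseable args" and
    # drops it. Registering it in a PipelineOptions subclass via
    # _add_argparse_args would make it parseable.
    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.get_all_options())  # parsing happens here, emitting the warning
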
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20211221155437870747-4049'
 createTime: '2021-12-21T15:54:43.862562Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-12-21_07_54_43-15551249272402950411'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb1221150501'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-12-21T15:54:43.862562Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2021-12-21_07_54_43-15551249272402950411]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2021-12-21_07_54_43-15551249272402950411
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-21_07_54_43-15551249272402950411?project=apache-beam-testing
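
Besides the console link above, the job's state can be checked programmatically. A rough sketch against the public Dataflow REST API (v1b3), assuming application-default credentials are available; this is not part of the test itself:

    from googleapiclient.discovery import build

    # projects.locations.jobs.get returns the Job resource, whose
    # currentState field matches the JOB_STATE_* values logged below.
    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().locations().jobs().get(
        projectId='apache-beam-testing',
        location='us-central1',
        jobId='2021-12-21_07_54_43-15551249272402950411').execute()
    print(job['currentState'])
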
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2021-12-21_07_54_43-15551249272402950411 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:51.039Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:51.860Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:51.912Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.024Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.062Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.096Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.130Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.164Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.206Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.252Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at 
iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.274Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
 into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.300Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
 into 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.329Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.357Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub 
message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.389Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure 
time
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.414Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to 
Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.518Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.546Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.580Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.603Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.676Z: 
JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.710Z: 
JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:54:52.729Z: 
JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:55:20.781Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
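
The metric-descriptor quota note above is actionable with the Cloud Monitoring client. A housekeeping sketch (assumes the google-cloud-monitoring package; the filter and the commented-out deletion are illustrative):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # For descriptors that are no longer used:
        # client.delete_metric_descriptor(request={'name': descriptor.name})
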
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:55:42.722Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:56:12.826Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T15:56:12.860Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2021-12-21_07_54_43-15551249272402950411 after 603 seconds
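
The timeout above is the harness ending its bounded wait, not a job failure; a streaming job keeps running after the caller stops waiting. In Beam Python the wait is bounded roughly like this (sketch; wait_until_finish takes a duration in milliseconds):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    result = beam.Pipeline(options=PipelineOptions()).run()
    # Returns after ~600 s even if the streaming job is still RUNNING,
    # which is what produces the "Timing out on waiting" warning above.
    result.wait_until_finish(duration=600 * 1000)
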
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results 
for test: 1696314ca9aa4f019dd6519f6dcdc8a0 and timestamp: 1640102881.2510648:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
pubsub_io_perf_write_runtime Value: 100
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.36.0.dev
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211213
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211213" for Docker 
environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb1221150501.1640102886.702978/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20211221160806703964-5679'
 createTime: '2021-12-21T16:08:13.983904Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-12-21_08_08_13-12955318427864248413'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb1221150501'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-12-21T16:08:13.983904Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2021-12-21_08_08_13-12955318427864248413]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2021-12-21_08_08_13-12955318427864248413
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-21_08_08_13-12955318427864248413?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2021-12-21_08_08_13-12955318427864248413 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:21.110Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:22.708Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:22.764Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:22.821Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:22.891Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:22.920Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.059Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.109Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.139Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) 
into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.171Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at 
pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.199Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at 
pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.220Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.252Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.285Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/Combine/ConvertToAccumulators into Count 
messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.332Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/GroupByKey/WriteStream into Count 
messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.369Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into 
Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.391Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.424Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count 
messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.455Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.487Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert 
to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.516Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to 
Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.558Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.593Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.617Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.661Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.719Z: 
JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.740Z: 
JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:23.788Z: 
JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:08:43.035Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:09:08.330Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:09:08.373Z: 
JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5.  This could be 
a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:09:18.631Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:09:41.083Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-12-21T16:09:41.103Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2021-12-21_08_08_13-12955318427864248413 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_bd9ca3d4-b3ec-4f38-ae1c-95d60f1e0e8c_read_matcher.
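
The matcher behind this error repeatedly pulls from the *_read_matcher subscription until the expected messages arrive or the 900 s deadline passes. A roughly equivalent check with the Pub/Sub client (sketch, not the matcher's actual implementation):

    from google.cloud import pubsub_v1

    subscription = (
        'projects/apache-beam-testing/subscriptions/'
        'pubsub_io_performance_bd9ca3d4-b3ec-4f38-ae1c-95d60f1e0e8c_read_matcher')
    client = pubsub_v1.SubscriberClient()
    # "Received 0 messages" above means pulls like this one kept coming
    # back empty until the deadline.
    response = client.pull(subscription=subscription, max_messages=100,
                           timeout=60)
    print(len(response.received_messages))
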
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 223, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 149, in run
    self.result = self.pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
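
The traceback bottoms out in PyHamcrest: the test's on_success_matcher expected exactly one message (with payload b'2097152') and received none. The failing pattern reduces to this (illustrative):

    from hamcrest import assert_that, equal_to

    expected, received = 1, 0
    # Raises AssertionError with an "Expected: ... but: ..." description
    # like the one above.
    assert_that(received, equal_to(expected))
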

Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-21_07_54_43-15551249272402950411?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-21_08_08_13-12955318427864248413?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 52s
65 actionable tasks: 64 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/vmgzzz3gdbrkw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
