See
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/6051/display/redirect?page=changes>
Changes:
[noreply] Add RunInference example for TensorFlow Hub pre-trained model (#24529)
[noreply] update(PULL Request template) remove Choose reviewer (#24540)
[noreply] Revert "Bump actions/setup-java from 3.6.0 to 3.7.0 (#24484)" (#24551)
[noreply] Interface{}->any for more subfolders (#24553)
------------------------------------------
[...truncated 16.27 MB...]
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1684:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:172:
BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use
ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
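For reference, the migration this BeamDeprecationWarning asks for looks roughly
like the sketch below. The query string and the bare Pipeline() are placeholders
rather than what bigquery_read_it_test.py actually uses, and against a real
project ReadFromBigQuery generally also needs a temp_location/gcs_location for
its export files.

import apache_beam as beam

with beam.Pipeline() as p:  # placeholder options; real runs need GCP settings
    # Deprecated since 2.25.0:
    #   beam.io.Read(beam.io.BigQuerySource(query=..., use_standard_sql=True))
    # Suggested replacement:
    rows = p | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
        query='SELECT 1 AS x',  # placeholder query
        use_standard_sql=True)
    rows | 'Print' >> beam.Map(print)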
apache_beam/ml/gcp/cloud_dlp_it_test.py::CloudDLPIT::test_deidentification
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77:
FutureWarning: MaskDetectedDetails is experimental.
inspection_config=INSPECT_CONFIG))
apache_beam/ml/gcp/cloud_dlp_it_test.py::CloudDLPIT::test_inspection
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87:
FutureWarning: InspectForDetails is experimental.
| beam.ParDo(extract_inspection_results).with_outputs(
apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_regression
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/inference/sklearn_japanese_housing_regression.py>:129:
FutureWarning: SklearnModelHandlerPandas is experimental. No
backwards-compatibility guarantees.
model_file_type=ModelFileType.PICKLE, model_uri=model_filename)
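The experimental handler flagged here plugs into RunInference roughly as below;
the model path and the toy DataFrame are placeholders, not the Japanese housing
model the integration test downloads.

import apache_beam as beam
import pandas as pd
from apache_beam.ml.inference.base import RunInference
from apache_beam.ml.inference.sklearn_inference import (
    ModelFileType, SklearnModelHandlerPandas)

# Placeholder model location; the IT points at a real pickled model on GCS.
model_handler = SklearnModelHandlerPandas(
    model_uri='gs://example-bucket/model.pickle',
    model_file_type=ModelFileType.PICKLE)

with beam.Pipeline() as p:
    _ = (
        p
        | beam.Create([pd.DataFrame({'feature': [1.0, 2.0]})])
        | RunInference(model_handler)
        | beam.Map(print))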
apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_regression
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/dill/_dill.py>:472:
FutureWarning: SklearnModelHandlerPandas is experimental. No
backwards-compatibility guarantees.
obj = StockUnpickler.load(self)
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:695:
BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use
ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:810:
FutureWarning: ReadAllFromBigQuery is experimental.
| beam.io.ReadAllFromBigQuery())
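ReadAllFromBigQuery (the experimental transform this warning is about) takes a
PCollection of read requests rather than a single query; a minimal sketch,
assuming the ReadFromBigQueryRequest helper lives in apache_beam.io.gcp.bigquery
and using placeholder queries:

import apache_beam as beam
from apache_beam.io.gcp.bigquery import ReadAllFromBigQuery, ReadFromBigQueryRequest

with beam.Pipeline() as p:  # placeholder options
    rows = (
        p
        | beam.Create([
            ReadFromBigQueryRequest(query='SELECT 1 AS x'),  # placeholder
            ReadFromBigQueryRequest(query='SELECT 2 AS x'),  # placeholder
        ])
        | ReadAllFromBigQuery())
    rows | beam.Map(print)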
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2658:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2659:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2672:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
options=pcoll.pipeline.options,
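These repeated "options is deprecated" warnings all come from reading options
back off a pipeline or PCollection (pcoll.pipeline.options,
self.test_pipeline.options). The pattern the deprecation points toward is to
keep hold of the PipelineOptions object you constructed and query that instead;
a minimal sketch with placeholder flags:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions(streaming=False)  # placeholder flags

with beam.Pipeline(options=options) as p:
    # Instead of p.options.view_as(StandardOptions).streaming (the deprecated
    # accessor the warnings flag), consult the object that was passed in:
    is_streaming = options.view_as(StandardOptions).streaming
    _ = p | beam.Create([is_streaming]) | beam.Map(print)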
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file:
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml>
-
=========================== short test summary info ============================
FAILED
apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py::ExerciseMetricsPipelineTest::test_metrics_fnapi_it
- AssertionError: "Unable to match metrics for matcher name: 'ElementCount'
(label_key: 'output_user_name' label_value: 'GroupByKey/Reify-out0').
(label_key: 'original_name' label_value: 'GroupByKey/Reify-out0-ElementCount').
attempted: a value greater than <0> committed: a value greater than <0>Unable
to match metrics for matcher name: 'MeanByteCount' (label_key:
'output_user_name' label_value: 'GroupByKey/Reify-out0'). (label_key:
'original_name' label_value: 'GroupByKey/Reify-out0-MeanByteCount').
attempted: a value greater than <0> committed: a value greater than <0>Unable
to match metrics for matcher name: 'ElementCount' (label_key:
'output_user_name' label_value: 'GroupByKey/GroupByWindow-out0'). (label_key:
'original_name' label_value: 'GroupByKey/GroupByWindow-out0-ElementCount').
attempted: a value greater than <0> committed: a value greater than <0>Unable
to match metrics for matcher name: 'MeanByteCount' (label_key:
'output_user_name' label_value: 'GroupByKey/GroupByWindow-out0'). (label_key:
'original_name' label_value: 'GroupByKey/GroupByWindow-out0-MeanByteCount').
attempted: a value greater than <0> committed: a value greater than <0>Unable
to match metrics for matcher name: 'ElementCount' (label_key:
'output_user_name' label_value: 'GroupByKey/Reify-out0'). (label_key:
'original_name' label_value: 'GroupByKey/Reify-out0-ElementCount'). attempted:
a value greater than <0> committed: a value greater than <0>Unable to match
metrics for matcher name: 'MeanByteCount' (label_key: 'output_user_name'
label_value: 'GroupByKey/Reify-out0'). (label_key: 'original_name'
label_value: 'GroupByKey/Reify-out0-MeanByteCount'). attempted: a value
greater than <0> committed: a value greater than <0>\nActual
MetricResults:\nMetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline.UserMetricsDoFn,
name=total_values), labels={}), committed=100,
attempted=100)\nMetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline.UserMetricsDoFn,
name=distribution_values), labels={}), committed=DistributionResult(sum=100,
count=4, min=0, max=100, mean=25.0), attempted=DistributionResult(sum=100,
count=4, min=0, max=100, mean=25.0))\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=TotalVcpuTime),
labels={'original_name': 'Service-cpu_num_seconds'}), committed=1027,
attempted=1027)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=TotalMemoryUsage),
labels={'original_name': 'Service-mem_mb_seconds'}), committed=4206710,
attempted=4206710)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=TotalPdUsage),
labels={'original_name': 'Service-pd_gb_seconds'}), committed=12837,
attempted=12837)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=TotalSsdUsage),
labels={'original_name': 'Service-pd_ssd_gb_seconds'}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=TotalShuffleDataProcessed),
labels={'original_name': 'Service-shuffle_service_actual_gb'}), committed=None,
attempted=None)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=BillableShuffleDataProcessed),
labels={'original_name': 'Service-shuffle_service_chargeable_gb'}),
committed=None, attempted=None)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=TotalStreamingDataProcessed),
labels={'original_name': 'Service-streaming_service_gb'}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=TotalGpuTime),
labels={'original_name': 'Service-gpu_num_seconds'}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=CurrentShuffleSlotCount),
labels={'original_name': 'dax-shuffle-current-shuffle-slots'}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=CurrentVcpuCount),
labels={'original_name': 'Service-cpu_num'}), committed=2,
attempted=2)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=CurrentMemoryUsage),
labels={'original_name': 'Service-mem_mb'}), committed=8192,
attempted=8192)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=CurrentPdUsage),
labels={'original_name': 'Service-pd_gb'}), committed=25,
attempted=25)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=CurrentSsdUsage),
labels={'original_name': 'Service-pd_ssd_gb'}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=CurrentGpuCount),
labels={'original_name': 'Service-gpu_num'}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=PeakShuffleSlotCount),
labels={'original_name': 'dax-shuffle-peak-shuffle-slots'}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionStepProgress),
labels={'execution_step': 'F16', 'original_name':
'dax_workflow_stage_progress_/workflows/wf-2022-12-06_19_21_40-16219525743346974276/phases/graph_runner/step-invocations/F16-invoke-868847074905018489/map-tasks-completion/map-tasks'}),
committed=1, attempted=1)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name': 'Create/FlatMap(<lambda at
core.py:3501>)-out0-MeanByteCount', 'output_user_name': 'Create/FlatMap(<lambda
at core.py:3501>)-out0'}), committed=14,
attempted=14)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0-MeanByteCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0'}),
committed=27, attempted=27)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name': 'Create/Impulse-out0-MeanByteCount',
'output_user_name': 'Create/Impulse-out0'}), committed=13,
attempted=13)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name': 'Create/FlatMap(<lambda at core.py:3501>)-out0',
'original_name': 'Create/FlatMap(<lambda at
core.py:3501>)-out0-ElementCount'}), committed=4,
attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0', 'original_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0-MeanByteCount'}),
committed=19, attempted=19)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'original_name': 'Create/Impulse-out0-ElementCount',
'output_user_name': 'Create/Impulse-out0'}), committed=1,
attempted=1)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0-MeanByteCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0'}),
committed=46, attempted=46)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0-ElementCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0'}),
committed=4, attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0', 'original_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0-ElementCount'}),
committed=4, attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0',
'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0-ElementCount'}),
committed=4,
attempted=4)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/AddRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/FlatMap(<lambda at
core.py:3501>), metric=MetricName(namespace=dataflow/v1b3,
name=ExecutionTime_FinishBundle), labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/FlatMap(<lambda at
core.py:3501>), metric=MetricName(namespace=dataflow/v1b3,
name=ExecutionTime_StartBundle), labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/AddRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/FlatMap(<lambda at
core.py:3501>), metric=MetricName(namespace=dataflow/v1b3,
name=ExecutionTime_ProcessElement), labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/AddRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionStepProgress),
labels={'execution_step': 'F15', 'original_name':
'dax_workflow_stage_progress_/workflows/wf-2022-12-06_19_21_40-16219525743346974276/phases/graph_runner/step-invocations/F15-invoke-868847074905021409/map-tasks-completion/map-tasks'}),
committed=1, attempted=1)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name': 'Create/Map(decode)-out0', 'original_name':
'Create/Map(decode)-out0-ElementCount'}), committed=4,
attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name': 'map_to_common_key-out0', 'original_name':
'map_to_common_key-out0-ElementCount'}), committed=4,
attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0',
'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0-ElementCount'}),
committed=4, attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'output_user_name': 'Create/Map(decode)-out0', 'original_name':
'Create/Map(decode)-out0-MeanByteCount'}), committed=14,
attempted=14)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0',
'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0-MeanByteCount'}),
committed=49, attempted=49)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name': 'metrics-out0', 'original_name':
'metrics-out0-ElementCount'}), committed=4,
attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0-MeanByteCount',
'output_user_name': 'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0'}),
committed=14, attempted=14)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0-MeanByteCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0'}),
committed=19, attempted=19)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'output_user_name': 'map_to_common_key-out0', 'original_name':
'map_to_common_key-out0-MeanByteCount'}), committed=18,
attempted=18)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0-ElementCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0'}),
committed=4, attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0', 'original_name':
'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0-ElementCount'}),
committed=4, attempted=4)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name': 'metrics-out0-MeanByteCount', 'output_user_name':
'metrics-out0'}), committed=14,
attempted=14)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/Map(decode),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/Map(decode),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)\nMetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=3993,
attempted=3993)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/Map(decode),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)\nMetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=946,
attempted=946)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=map_to_common_key,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=map_to_common_key,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)\nMetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=1000,
attempted=1000)\nMetricResult(key=MetricKey(step=map_to_common_key,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0,
attempted=0)\nMetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionStepProgress),
labels={'original_name':
'dax_workflow_stage_progress_/workflows/wf-2022-12-06_19_21_40-16219525743346974276/phases/graph_runner/step-invocations/F14-invoke-868847074905018773/map-tasks-completion/map-tasks',
'execution_step': 'F14'}), committed=1,
attempted=1)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'original_name': 'GroupByKey/Read-out0-ElementCount',
'output_user_name': 'GroupByKey/Read-out0'}), committed=1,
attempted=1)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'output_user_name': 'GroupByKey/Read-out0', 'original_name':
'GroupByKey/Read-out0-MeanByteCount'}), committed=28,
attempted=28)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=MeanByteCount),
labels={'original_name': 'm_out-out0-MeanByteCount', 'output_user_name':
'm_out-out0'}), committed=15, attempted=15)\nMetricResult(key=MetricKey(step=,
metric=MetricName(namespace=dataflow/v1b3, name=ElementCount),
labels={'original_name': 'm_out-out0-ElementCount', 'output_user_name':
'm_out-out0'}), committed=5,
attempted=5)\nMetricResult(key=MetricKey(step=m_out,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)\nMetricResult(key=MetricKey(step=m_out,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)\nMetricResult(key=MetricKey(step=m_out,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)" is not false : Unable to match metrics
for matcher name: 'ElementCount' (label_key: 'output_user_name' label_value:
'GroupByKey/Reify-out0'). (label_key: 'original_name' label_value:
'GroupByKey/Reify-out0-ElementCount'). attempted: a value greater than <0>
committed: a value greater than <0>Unable to match metrics for matcher name:
'MeanByteCount' (label_key: 'output_user_name' label_value:
'GroupByKey/Reify-out0'). (label_key: 'original_name' label_value:
'GroupByKey/Reify-out0-MeanByteCount'). attempted: a value greater than <0>
committed: a value greater than <0>Unable to match metrics for matcher name:
'ElementCount' (label_key: 'output_user_name' label_value:
'GroupByKey/GroupByWindow-out0'). (label_key: 'original_name' label_value:
'GroupByKey/GroupByWindow-out0-ElementCount'). attempted: a value greater than
<0> committed: a value greater than <0>Unable to match metrics for matcher
name: 'MeanByteCount' (label_key: 'output_user_name' label_value:
'GroupByKey/GroupByWindow-out0'). (label_key: 'original_name' label_value:
'GroupByKey/GroupByWindow-out0-MeanByteCount'). attempted: a value greater
than <0> committed: a value greater than <0>Unable to match metrics for matcher
name: 'ElementCount' (label_key: 'output_user_name' label_value:
'GroupByKey/Reify-out0'). (label_key: 'original_name' label_value:
'GroupByKey/Reify-out0-ElementCount'). attempted: a value greater than <0>
committed: a value greater than <0>Unable to match metrics for matcher name:
'MeanByteCount' (label_key: 'output_user_name' label_value:
'GroupByKey/Reify-out0'). (label_key: 'original_name' label_value:
'GroupByKey/Reify-out0-MeanByteCount'). attempted: a value greater than <0>
committed: a value greater than <0>
Actual MetricResults:
MetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline.UserMetricsDoFn,
name=total_values), labels={}), committed=100, attempted=100)
MetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline.UserMetricsDoFn,
name=distribution_values), labels={}), committed=DistributionResult(sum=100,
count=4, min=0, max=100, mean=25.0), attempted=DistributionResult(sum=100,
count=4, min=0, max=100, mean=25.0))
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=TotalVcpuTime), labels={'original_name': 'Service-cpu_num_seconds'}),
committed=1027, attempted=1027)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=TotalMemoryUsage), labels={'original_name': 'Service-mem_mb_seconds'}),
committed=4206710, attempted=4206710)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=TotalPdUsage), labels={'original_name': 'Service-pd_gb_seconds'}),
committed=12837, attempted=12837)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=TotalSsdUsage), labels={'original_name': 'Service-pd_ssd_gb_seconds'}),
committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=TotalShuffleDataProcessed), labels={'original_name':
'Service-shuffle_service_actual_gb'}), committed=None, attempted=None)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=BillableShuffleDataProcessed), labels={'original_name':
'Service-shuffle_service_chargeable_gb'}), committed=None, attempted=None)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=TotalStreamingDataProcessed), labels={'original_name':
'Service-streaming_service_gb'}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=TotalGpuTime), labels={'original_name': 'Service-gpu_num_seconds'}),
committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=CurrentShuffleSlotCount), labels={'original_name':
'dax-shuffle-current-shuffle-slots'}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=CurrentVcpuCount), labels={'original_name': 'Service-cpu_num'}),
committed=2, attempted=2)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=CurrentMemoryUsage), labels={'original_name': 'Service-mem_mb'}),
committed=8192, attempted=8192)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=CurrentPdUsage), labels={'original_name': 'Service-pd_gb'}), committed=25,
attempted=25)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=CurrentSsdUsage), labels={'original_name': 'Service-pd_ssd_gb'}),
committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=CurrentGpuCount), labels={'original_name': 'Service-gpu_num'}),
committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=PeakShuffleSlotCount), labels={'original_name':
'dax-shuffle-peak-shuffle-slots'}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ExecutionStepProgress), labels={'execution_step': 'F16', 'original_name':
'dax_workflow_stage_progress_/workflows/wf-2022-12-06_19_21_40-16219525743346974276/phases/graph_runner/step-invocations/F16-invoke-868847074905018489/map-tasks-completion/map-tasks'}),
committed=1, attempted=1)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name': 'Create/FlatMap(<lambda at
core.py:3501>)-out0-MeanByteCount', 'output_user_name': 'Create/FlatMap(<lambda
at core.py:3501>)-out0'}), committed=14, attempted=14)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0-MeanByteCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0'}),
committed=27, attempted=27)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name':
'Create/Impulse-out0-MeanByteCount', 'output_user_name':
'Create/Impulse-out0'}), committed=13, attempted=13)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name': 'Create/FlatMap(<lambda at
core.py:3501>)-out0', 'original_name': 'Create/FlatMap(<lambda at
core.py:3501>)-out0-ElementCount'}), committed=4, attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0', 'original_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0-MeanByteCount'}),
committed=19, attempted=19)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'original_name':
'Create/Impulse-out0-ElementCount', 'output_user_name':
'Create/Impulse-out0'}), committed=1, attempted=1)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0-MeanByteCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0'}),
committed=46, attempted=46)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0-ElementCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify-out0'}),
committed=4, attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0', 'original_name':
'Create/MaybeReshuffle/Reshuffle/AddRandomKeys-out0-ElementCount'}),
committed=4, attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0',
'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)-out0-ElementCount'}),
committed=4, attempted=4)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/AddRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/FlatMap(<lambda at core.py:3501>),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/FlatMap(<lambda at core.py:3501>),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/AddRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/FlatMap(<lambda at core.py:3501>),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/AddRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ExecutionStepProgress), labels={'execution_step': 'F15', 'original_name':
'dax_workflow_stage_progress_/workflows/wf-2022-12-06_19_21_40-16219525743346974276/phases/graph_runner/step-invocations/F15-invoke-868847074905021409/map-tasks-completion/map-tasks'}),
committed=1, attempted=1)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name': 'Create/Map(decode)-out0',
'original_name': 'Create/Map(decode)-out0-ElementCount'}), committed=4,
attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name': 'map_to_common_key-out0',
'original_name': 'map_to_common_key-out0-ElementCount'}), committed=4,
attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0',
'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0-ElementCount'}),
committed=4, attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'output_user_name': 'Create/Map(decode)-out0',
'original_name': 'Create/Map(decode)-out0-MeanByteCount'}), committed=14,
attempted=14)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0',
'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read-out0-MeanByteCount'}),
committed=49, attempted=49)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name': 'metrics-out0',
'original_name': 'metrics-out0-ElementCount'}), committed=4, attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0-MeanByteCount',
'output_user_name': 'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0'}),
committed=14, attempted=14)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0-MeanByteCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0'}),
committed=19, attempted=19)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'output_user_name': 'map_to_common_key-out0',
'original_name': 'map_to_common_key-out0-MeanByteCount'}), committed=18,
attempted=18)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'original_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0-ElementCount',
'output_user_name':
'Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)-out0'}),
committed=4, attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'output_user_name':
'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0', 'original_name':
'Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys-out0-ElementCount'}),
committed=4, attempted=4)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name': 'metrics-out0-MeanByteCount',
'output_user_name': 'metrics-out0'}), committed=14, attempted=14)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/Map(decode),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/Map(decode),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=3993, attempted=3993)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/Map(decode),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=946, attempted=946)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=map_to_common_key,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=map_to_common_key,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=metrics,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=1000, attempted=1000)
MetricResult(key=MetricKey(step=map_to_common_key,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps),
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ExecutionStepProgress), labels={'original_name':
'dax_workflow_stage_progress_/workflows/wf-2022-12-06_19_21_40-16219525743346974276/phases/graph_runner/step-invocations/F14-invoke-868847074905018773/map-tasks-completion/map-tasks',
'execution_step': 'F14'}), committed=1, attempted=1)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'original_name':
'GroupByKey/Read-out0-ElementCount', 'output_user_name':
'GroupByKey/Read-out0'}), committed=1, attempted=1)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'output_user_name': 'GroupByKey/Read-out0',
'original_name': 'GroupByKey/Read-out0-MeanByteCount'}), committed=28,
attempted=28)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=MeanByteCount), labels={'original_name': 'm_out-out0-MeanByteCount',
'output_user_name': 'm_out-out0'}), committed=15, attempted=15)
MetricResult(key=MetricKey(step=, metric=MetricName(namespace=dataflow/v1b3,
name=ElementCount), labels={'original_name': 'm_out-out0-ElementCount',
'output_user_name': 'm_out-out0'}), committed=5, attempted=5)
MetricResult(key=MetricKey(step=m_out,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_StartBundle),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=m_out,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_ProcessElement),
labels={}), committed=0, attempted=0)
MetricResult(key=MetricKey(step=m_out,
metric=MetricName(namespace=dataflow/v1b3, name=ExecutionTime_FinishBundle),
labels={}), committed=0, attempted=0)
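The assertion above is the metric matcher comparing expected
ElementCount/MeanByteCount labels against the MetricResults pulled from the
finished Dataflow job. Querying job metrics from a PipelineResult looks roughly
like the sketch below; the counter name mirrors the total_values user counter in
the dump, everything else (runner, data) is a placeholder:

import apache_beam as beam
from apache_beam.metrics import Metrics
from apache_beam.metrics.metric import MetricsFilter

class CountValues(beam.DoFn):
    def __init__(self):
        self.total = Metrics.counter(self.__class__, 'total_values')

    def process(self, element):
        self.total.inc(element)
        yield element

p = beam.Pipeline()  # placeholder: the IT runs this on TestDataflowRunner
_ = p | beam.Create([1, 2, 3, 4]) | 'metrics' >> beam.ParDo(CountValues())
result = p.run()
result.wait_until_finish()

query = result.metrics().query(MetricsFilter().with_name('total_values'))
for counter in query['counters']:
    print(counter.key, counter.committed, counter.attempted)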
===== 1 failed, 85 passed, 19 skipped, 225 warnings in 13802.94s (3:50:02) =====
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --region=us-central1
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes
>>> --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>> collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-7.2.0, pluggy-1.0.0
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>,
configfile: pytest.ini
plugins: xdist-2.5.0, hypothesis-6.60.0, timeout-2.1.0, forked-1.4.0,
requests-mock-1.10.0
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15]
/ gw7 [15]
scheduling tests via LoadFileScheduling
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
[gw1] SKIPPED
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw1] SKIPPED
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw0] PASSED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw1] PASSED
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw1] PASSED
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw0] PASSED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] SKIPPED
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] PASSED
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15:
DeprecationWarning: the imp module is deprecated in favour of importlib; see
the module's documentation for alternative uses
from imp import load_source
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128:
FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility
guarantees.
sql="select * from Users")
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190:
FutureWarning: WriteToSpanner is experimental. No backwards-compatibility
guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171:
FutureWarning: WriteToSpanner is experimental. No backwards-compatibility
guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117:
FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility
guarantees.
columns=["UserId", "Key"])
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135:
FutureWarning: WriteToSpanner is experimental. No backwards-compatibility
guarantees.
max_batch_size_bytes=250))
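For context, the experimental spannerio transforms these warnings flag are used
roughly as follows. Project/instance/database IDs, the table, and the keyword
names are placeholders as I recall them from
apache_beam.io.gcp.experimental.spannerio, not values from this job:

import apache_beam as beam
from apache_beam.io.gcp.experimental.spannerio import (
    ReadFromSpanner, WriteMutation, WriteToSpanner)

with beam.Pipeline() as p:  # placeholder options
    # Read path (spannerio_read_it_test.py):
    rows = p | 'Read' >> ReadFromSpanner(
        project_id='my-project',      # placeholder
        instance_id='my-instance',    # placeholder
        database_id='my-database',    # placeholder
        sql='SELECT UserId, Key FROM Users')
    rows | beam.Map(print)

with beam.Pipeline() as p:
    # Write path (spannerio_write_it_test.py):
    _ = (
        p
        | beam.Create([
            WriteMutation.insert(
                table='Users',                 # placeholder table
                columns=('UserId', 'Key'),
                values=[('user-1', 'key-1')]),
        ])
        | 'Write' >> WriteToSpanner(
            project_id='my-project',
            instance_id='my-instance',
            database_id='my-database',
            max_batch_size_bytes=250))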
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file:
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml>
-
=========== 5 passed, 15 skipped, 14 warnings in 2030.47s (0:33:50) ============
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>'
line: 111
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 4h 27m 42s
212 actionable tasks: 148 executed, 58 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/5mku4a3cr6hdc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]