See 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/270/display/redirect>

Changes:


------------------------------------------
[...truncated 37.79 KB...]
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, 
zstandard, wrapt, websocket-client, urllib3, typing-extensions, tomli, 
threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, 
pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, 
psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, 
objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, 
googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, 
exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, 
certifi, cachetools, attrs, sqlalchemy, scikit-learn, rsa, requests, pytest, 
pymongo, pydot, pyasn1-modules, pandas, hypothesis, httplib2, grpcio-status, 
google-resumable-media, freezegun, cffi, botocore, s3transfer, requests_mock, 
pytest-xdist, pytest-timeout, oauth2client, hdfs, grpc-google-iam-v1, 
google-auth, docker, cryptography, azure-core, testcontainers, 
google-auth-httplib2, google-apitools, google-api-core, boto3, 
azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, 
google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, 
google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, 
google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, 
google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, 
azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 
azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 
boto3-1.26.147 botocore-1.29.147 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 
charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 
deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 
exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 
freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 
google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 
google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 
google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 
google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 
google-cloud-pubsublite-1.8.2 google-cloud-recommendations-ai-0.10.3 
google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 
google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 
googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 
grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 
iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 
msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 
overrides-6.5.0 pandas-1.5.3 parameterized-0.9.0 portalocker-2.7.0 
proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 
pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 
pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 
pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 
requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 
scikit-learn-1.2.2 scipy-1.10.1 sortedcontainers-2.4.0 sqlalchemy-1.4.48 
sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 
tomli-2.0.1 typing-extensions-4.6.3 urllib3-1.26.16 websocket-client-1.5.2 
wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
Collecting torch>=1.7.1 (from -r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached torch-2.0.1-cp38-cp38-manylinux1_x86_64.whl (619.9 MB)
Collecting torchvision>=0.8.2 (from -r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 19))
  Downloading torchvision-0.15.2-cp38-cp38-manylinux1_x86_64.whl (33.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 33.8/33.8 MB 36.7 MB/s eta 0:00:00
Collecting pillow>=8.0.0 (from -r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 20))
  Using cached Pillow-9.5.0-cp38-cp38-manylinux_2_28_x86_64.whl (3.4 MB)
Collecting transformers>=4.18.0 (from -r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21))
  Downloading transformers-4.29.2-py3-none-any.whl (7.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.1/7.1 MB 71.4 MB/s eta 0:00:00
Requirement already satisfied: filelock in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from torch>=1.7.1->-r apache_beam/ml/inference/torch_tests_requirements.txt 
(line 18)) (3.12.0)
Requirement already satisfied: typing-extensions in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from torch>=1.7.1->-r apache_beam/ml/inference/torch_tests_requirements.txt 
(line 18)) (4.6.3)
Collecting sympy (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached sympy-1.12-py3-none-any.whl (5.7 MB)
Collecting networkx (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached networkx-3.1-py3-none-any.whl (2.1 MB)
Collecting jinja2 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting nvidia-cuda-nvrtc-cu11==11.7.99 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cuda_nvrtc_cu11-11.7.99-2-py3-none-manylinux1_x86_64.whl 
(21.0 MB)
Collecting nvidia-cuda-runtime-cu11==11.7.99 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cuda_runtime_cu11-11.7.99-py3-none-manylinux1_x86_64.whl 
(849 kB)
Collecting nvidia-cuda-cupti-cu11==11.7.101 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cuda_cupti_cu11-11.7.101-py3-none-manylinux1_x86_64.whl 
(11.8 MB)
Collecting nvidia-cudnn-cu11==8.5.0.96 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cudnn_cu11-8.5.0.96-2-py3-none-manylinux1_x86_64.whl 
(557.1 MB)
Collecting nvidia-cublas-cu11==11.10.3.66 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cublas_cu11-11.10.3.66-py3-none-manylinux1_x86_64.whl 
(317.1 MB)
Collecting nvidia-cufft-cu11==10.9.0.58 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cufft_cu11-10.9.0.58-py3-none-manylinux1_x86_64.whl 
(168.4 MB)
Collecting nvidia-curand-cu11==10.2.10.91 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_curand_cu11-10.2.10.91-py3-none-manylinux1_x86_64.whl 
(54.6 MB)
Collecting nvidia-cusolver-cu11==11.4.0.1 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cusolver_cu11-11.4.0.1-2-py3-none-manylinux1_x86_64.whl 
(102.6 MB)
Collecting nvidia-cusparse-cu11==11.7.4.91 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_cusparse_cu11-11.7.4.91-py3-none-manylinux1_x86_64.whl 
(173.2 MB)
Collecting nvidia-nccl-cu11==2.14.3 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_nccl_cu11-2.14.3-py3-none-manylinux1_x86_64.whl (177.1 MB)
Collecting nvidia-nvtx-cu11==11.7.91 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached nvidia_nvtx_cu11-11.7.91-py3-none-manylinux1_x86_64.whl (98 kB)
Collecting triton==2.0.0 (from torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached 
triton-2.0.0-1-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (63.2 
MB)
Requirement already satisfied: setuptools in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from nvidia-cublas-cu11==11.10.3.66->torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18)) (67.8.0)
Requirement already satisfied: wheel in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from nvidia-cublas-cu11==11.10.3.66->torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18)) (0.40.0)
Collecting cmake (from triton==2.0.0->torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached 
cmake-3.26.3-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (24.0 
MB)
Collecting lit (from triton==2.0.0->torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached lit-16.0.5.post0-py3-none-any.whl
Requirement already satisfied: numpy in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from torchvision>=0.8.2->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 19)) (1.24.3)
Requirement already satisfied: requests in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from torchvision>=0.8.2->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 19)) (2.31.0)
Collecting huggingface-hub<1.0,>=0.14.1 (from transformers>=4.18.0->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21))
  Downloading huggingface_hub-0.15.1-py3-none-any.whl (236 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 236.8/236.8 kB 7.1 MB/s eta 0:00:00
Requirement already satisfied: packaging>=20.0 in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from transformers>=4.18.0->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21)) (23.1)
Requirement already satisfied: pyyaml>=5.1 in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from transformers>=4.18.0->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21)) (6.0)
Requirement already satisfied: regex!=2019.12.17 in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from transformers>=4.18.0->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21)) (2023.6.3)
Collecting tokenizers!=0.11.3,<0.14,>=0.11.1 (from transformers>=4.18.0->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21))
  Using cached 
tokenizers-0.13.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.8 
MB)
Collecting tqdm>=4.27 (from transformers>=4.18.0->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21))
  Using cached tqdm-4.65.0-py3-none-any.whl (77 kB)
Collecting fsspec (from huggingface-hub<1.0,>=0.14.1->transformers>=4.18.0->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 21))
  Using cached fsspec-2023.5.0-py3-none-any.whl (160 kB)
Collecting MarkupSafe>=2.0 (from jinja2->torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached 
MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 
kB)
Requirement already satisfied: charset-normalizer<4,>=2 in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from requests->torchvision>=0.8.2->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 19)) (3.1.0)
Requirement already satisfied: idna<4,>=2.5 in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from requests->torchvision>=0.8.2->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 19)) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from requests->torchvision>=0.8.2->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 19)) (1.26.16)
Requirement already satisfied: certifi>=2017.4.17 in 
<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages>
 (from requests->torchvision>=0.8.2->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 19)) (2023.5.7)
Collecting mpmath>=0.19 (from sympy->torch>=1.7.1->-r 
apache_beam/ml/inference/torch_tests_requirements.txt (line 18))
  Using cached mpmath-1.3.0-py3-none-any.whl (536 kB)
Installing collected packages: tokenizers, mpmath, lit, cmake, tqdm, sympy, 
pillow, nvidia-nvtx-cu11, nvidia-nccl-cu11, nvidia-cusparse-cu11, 
nvidia-curand-cu11, nvidia-cufft-cu11, nvidia-cuda-runtime-cu11, 
nvidia-cuda-nvrtc-cu11, nvidia-cuda-cupti-cu11, nvidia-cublas-cu11, networkx, 
MarkupSafe, fsspec, nvidia-cusolver-cu11, nvidia-cudnn-cu11, jinja2, 
huggingface-hub, transformers, triton, torch, torchvision
Successfully installed MarkupSafe-2.1.3 cmake-3.26.3 fsspec-2023.5.0 
huggingface-hub-0.15.1 jinja2-3.1.2 lit-16.0.5.post0 mpmath-1.3.0 networkx-3.1 
nvidia-cublas-cu11-11.10.3.66 nvidia-cuda-cupti-cu11-11.7.101 
nvidia-cuda-nvrtc-cu11-11.7.99 nvidia-cuda-runtime-cu11-11.7.99 
nvidia-cudnn-cu11-8.5.0.96 nvidia-cufft-cu11-10.9.0.58 
nvidia-curand-cu11-10.2.10.91 nvidia-cusolver-cu11-11.4.0.1 
nvidia-cusparse-cu11-11.7.4.91 nvidia-nccl-cu11-2.14.3 nvidia-nvtx-cu11-11.7.91 
pillow-9.5.0 sympy-1.12 tokenizers-0.13.3 torch-2.0.1 torchvision-0.15.2 
tqdm-4.65.0 transformers-4.29.2 triton-2.0.0
INFO:root:Device is set to CPU
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.portability.stager:Executing command: 
['<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/build/gradleenv/1329484227/bin/python>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'/tmp/tmpnxby_he8/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', 
'--implementation', 'cp', '--abi', 'cp38', '--platform', 'manylinux2014_x86_64']
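
The pip download above is the dependency-staging step the Beam stager runs when a requirements file is supplied to the pipeline. A minimal sketch of the option that triggers it; the option values are taken from this job's log, and the requirements file name is a placeholder, not the temp file used here:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch only: supplying requirements_file makes the Dataflow runner run a
    # 'pip download' step like the one above to populate its requirements cache.
    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',       # project/region/bucket as in this job
        region='us-central1',
        temp_location='gs://temp-storage-for-perf-tests/loadtests',
        requirements_file='requirements.txt',  # placeholder path
    )
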
INFO:apache_beam.runners.portability.stager:Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz";>
 to staging location.
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python38:beam-master-20230601
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python38:beam-master-20230601" for Docker 
environment
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function pack_combiners at 0x7f83406c4940> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function sort_stages at 0x7f83406c7160> ====================
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/requirements.txt...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/requirements.txt
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/scikit-learn-1.0.2.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/scikit-learn-1.0.2.tar.gz
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/mock-2.0.0-py2.py3-none-any.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/seaborn-0.12.2-py3-none-any.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/seaborn-0.12.2-py3-none-any.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/PyHamcrest-1.10.1-py3-none-any.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/transformers-4.29.2-py3-none-any.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/transformers-4.29.2-py3-none-any.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/inflection-0.5.1-py2.py3-none-any.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/inflection-0.5.1-py2.py3-none-any.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/beautifulsoup4-4.12.2-py3-none-any.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/beautifulsoup4-4.12.2-py3-none-any.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/parameterized-0.7.5-py2.py3-none-any.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/torch-2.0.1-cp38-cp38-manylinux1_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/torch-2.0.1-cp38-cp38-manylinux1_x86_64.whl
 in 31 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/torchvision-0.15.2-cp38-cp38-manylinux1_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/torchvision-0.15.2-cp38-cp38-manylinux1_x86_64.whl
 in 2 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/Pillow-9.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/Pillow-9.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/matplotlib-3.7.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/matplotlib-3.7.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/matplotlib-3.7.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/matplotlib-3.7.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/pipeline.pb
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230606175950745518-9447'
 createTime: '2023-06-06T18:00:32.930670Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-06_11_00_31-2138768957490843303'
 location: 'us-central1'
 name: 'benchmark-tests-pytorch-imagenet-python0606162548'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-06T18:00:32.930670Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2023-06-06_11_00_31-2138768957490843303]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2023-06-06_11_00_31-2138768957490843303
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-06_11_00_31-2138768957490843303?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-06-06_11_00_31-2138768957490843303 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:32.131Z: 
JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled. Use 
--experiments=disable_runner_v2 to opt out.
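
One way to opt out, as the message suggests, is to add that experiment to the pipeline options; a minimal, illustrative sketch (not the benchmark's actual configuration):

    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    # Equivalent to passing --experiments=disable_runner_v2 on the command line.
    options = PipelineOptions()
    options.view_as(DebugOptions).add_experiment('disable_runner_v2')
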
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:33.380Z: 
JOB_MESSAGE_BASIC: The pipeline is using shuffle service with a (boot) 
persistent disk size / type other than the default. If that configuration was 
intended solely to speed up the non-service shuffle, consider removing it to 
reduce costs as those disks are unused by the shuffle service.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:36.060Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.208Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.232Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.299Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.342Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
WriteOutputToGCS/Write/WriteImpl/GroupByKey: GroupByKey not followed by a 
combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.379Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.406Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.480Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.504Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
WriteOutputToGCS/Write/WriteImpl/InitializeWrite into 
WriteOutputToGCS/Write/WriteImpl/DoOnce/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.528Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
WriteOutputToGCS/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3634>) into 
WriteOutputToGCS/Write/WriteImpl/DoOnce/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.561Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
WriteOutputToGCS/Write/WriteImpl/DoOnce/Map(decode) into 
WriteOutputToGCS/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3634>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.587Z: 
JOB_MESSAGE_DETAILED: Fusing consumer ReadImageNames/Read/Map(<lambda at 
iobase.py:908>) into ReadImageNames/Read/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.617Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
 into ReadImageNames/Read/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.660Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
 into 
ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.702Z: 
JOB_MESSAGE_DETAILED: Fusing consumer FilterEmptyLines into 
ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.740Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
PyTorchRunInference/BeamML_RunInference_Preprocess-0 into FilterEmptyLines
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.782Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
PyTorchRunInference/BatchElements/ParDo(_GlobalWindowsBatchingDoFn) into 
PyTorchRunInference/BeamML_RunInference_Preprocess-0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.823Z: 
JOB_MESSAGE_DETAILED: Fusing consumer PyTorchRunInference/BeamML_RunInference 
into PyTorchRunInference/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.855Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
PyTorchRunInference/BeamML_RunInference_Postprocess-0 into 
PyTorchRunInference/BeamML_RunInference
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.888Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
WriteOutputToGCS/Write/WriteImpl/Map(<lambda at iobase.py:1140>) into 
PyTorchRunInference/BeamML_RunInference_Postprocess-0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.914Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
WriteOutputToGCS/Write/WriteImpl/WindowInto(WindowIntoFn) into 
WriteOutputToGCS/Write/WriteImpl/Map(<lambda at iobase.py:1140>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.952Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
WriteOutputToGCS/Write/WriteImpl/GroupByKey/Write into 
WriteOutputToGCS/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:37.980Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
WriteOutputToGCS/Write/WriteImpl/WriteBundles into 
WriteOutputToGCS/Write/WriteImpl/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.029Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.062Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.096Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.128Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.272Z: 
JOB_MESSAGE_DEBUG: Executing wait step start19
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.327Z: 
JOB_MESSAGE_BASIC: Executing operation 
ReadImageNames/Read/Impulse+ReadImageNames/Read/Map(<lambda at 
iobase.py:908>)+ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.358Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteOutputToGCS/Write/WriteImpl/DoOnce/Impulse+WriteOutputToGCS/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3634>)+WriteOutputToGCS/Write/WriteImpl/DoOnce/Map(decode)+WriteOutputToGCS/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.385Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:00:38.422Z: 
JOB_MESSAGE_BASIC: Starting 75 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-06-06_11_00_31-2138768957490843303 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:01:14.376Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
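
If the old custom metric descriptors do need to be pruned, the Cloud Monitoring client library (google-cloud-monitoring) can list and delete them. A hedged sketch; the filter below and the decision about which descriptors are unused are assumptions:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'  # project from this job

    # List only custom metric descriptors; review the list before deleting,
    # since deleting a descriptor also removes its historical time series.
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        client.delete_metric_descriptor(name=descriptor.name)
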
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:01:23.595Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 72 based on 
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:01:23.627Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 72, though goal was 75. 
 This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:01:33.293Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 74 based on 
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:01:33.321Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 74, though goal was 75. 
 This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:01:43.370Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 75 based on 
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:01:54.755Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.263Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: The Dataflow job appears to be 
stuck because no worker activity has been seen in the last 1h. Please check the 
worker logs in Stackdriver Logging. You can also get help with Cloud Dataflow at 
https://cloud.google.com/dataflow/support.
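
To check the worker logs the message points at, one option is to query Cloud Logging (formerly Stackdriver) for this job's Dataflow worker entries; a sketch using the standard Dataflow logging resource labels, with the exact filter left as an assumption:

    from google.cloud import logging as cloud_logging

    client = cloud_logging.Client(project='apache-beam-testing')
    log_filter = (
        'resource.type="dataflow_step" '
        'AND resource.labels.job_id="2023-06-06_11_00_31-2138768957490843303" '
        'AND severity>=WARNING')
    # Print the most recent warnings/errors emitted by the stuck workers.
    for entry in client.list_entries(filter_=log_filter,
                                     order_by=cloud_logging.DESCENDING):
        print(entry.timestamp, entry.payload)
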
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.358Z: 
JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 
2023-06-06_11_00_31-2138768957490843303.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.377Z: 
JOB_MESSAGE_ERROR: The Dataflow job appears to be stuck because no worker 
activity has been seen in the last 1h. Please check the worker logs in 
Stackdriver Logging. You can also get help with Cloud Dataflow at 
https://cloud.google.com/dataflow/support.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.427Z: 
JOB_MESSAGE_WARNING: Unable to delete temp files: 
"gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/dax-tmp-2023-06-06_11_00_31-2138768957490843303-S01-0-fdfbb8c37e22def4/[email protected]."
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.429Z: 
JOB_MESSAGE_WARNING: Unable to delete temp files: 
"gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/dax-tmp-2023-06-06_11_00_31-2138768957490843303-S04-0-329fc5ad211fab3b/[email protected]."
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.430Z: 
JOB_MESSAGE_WARNING: Unable to delete temp files: 
"gs://temp-storage-for-perf-tests/loadtests/benchmark-tests-pytorch-imagenet-python0606162548.1686074390.744067/dax-tmp-2023-06-06_11_00_31-2138768957490843303-S01-1-bee35b29a3f6b7f8/[email protected]."
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.450Z: 
JOB_MESSAGE_WARNING: 
S04:ReadImageNames/Read/Impulse+ReadImageNames/Read/Map(<lambda at 
iobase.py:908>)+ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
 failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.450Z: 
JOB_MESSAGE_WARNING: 
S01:WriteOutputToGCS/Write/WriteImpl/DoOnce/Impulse+WriteOutputToGCS/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3634>)+WriteOutputToGCS/Write/WriteImpl/DoOnce/Map(decode)+WriteOutputToGCS/Write/WriteImpl/InitializeWrite
 failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.473Z: 
JOB_MESSAGE_BASIC: Finished operation 
ReadImageNames/Read/Impulse+ReadImageNames/Read/Map(<lambda at 
iobase.py:908>)+ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadImageNames-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:39.473Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteOutputToGCS/Write/WriteImpl/DoOnce/Impulse+WriteOutputToGCS/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3634>)+WriteOutputToGCS/Write/WriteImpl/DoOnce/Map(decode)+WriteOutputToGCS/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:40.210Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:40.261Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:00:40.286Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:03:07.071Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 75 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:03:07.103Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:03:07.125Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-06-06_11_00_31-2138768957490843303 is in state JOB_STATE_FAILED
ERROR:apache_beam.runners.dataflow.dataflow_runner:Console URL: 
https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-06_11_00_31-2138768957490843303?project=<ProjectId>
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File 
"<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/testing/benchmarks/inference/pytorch_image_classification_benchmarks.py";,>
 line 68, in <module>
    PytorchVisionBenchmarkTest().run()
  File 
"<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py";,>
 line 148, in run
    self.test()
  File 
"<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/testing/benchmarks/inference/pytorch_image_classification_benchmarks.py";,>
 line 58, in test
    self.result = pytorch_image_classification.run(
  File 
"<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/examples/inference/pytorch_image_classification.py";,>
 line 166, in run
    result.wait_until_finish()
  File 
"<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py";,>
 line 1563, in wait_until_finish
    raise DataflowRuntimeException(
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
The Dataflow job appears to be stuck because no worker activity has been seen in 
the last 1h. Please check the worker logs in Stackdriver Logging. You can also 
get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
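
For reference, the exception above is raised by wait_until_finish(). A minimal sketch of how a caller might surface it explicitly; the toy pipeline below is a placeholder, not the PyTorch RunInference benchmark pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    # Placeholder pipeline with the same project/region/bucket as this job.
    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',
        region='us-central1',
        temp_location='gs://temp-storage-for-perf-tests/loadtests')
    pipeline = beam.Pipeline(options=options)
    _ = pipeline | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)

    result = pipeline.run()
    try:
        result.wait_until_finish()
    except DataflowRuntimeException as exc:
        # The message carries the terminal state and the Dataflow error, e.g. the
        # "no worker activity ... in the last 1h" cause reported above.
        print(f'Dataflow job failed: {exc}')
        raise
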

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>'
 line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 19s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/ke3g62gt44qc6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

