HaeSe0ng opened a new issue, #27355: URL: https://github.com/apache/beam/issues/27355
### What happened?

I tried to launch a Dataflow Flex Template that uses the Python SDK and the BigQuery Storage Write API (via the expansion service), but the error below is printed. When I remove the `method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API` option from my code and use the default method for the BigQuery connector, it works (a minimal sketch of the write step in question is at the end of this issue). What should I do to avoid this error?

```
2023-07-05 07:53:14.788 UTC INFO:apache_beam.runners.portability.stager:Executing command: ['/usr/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpt1085nc7/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
2023-07-05 07:53:16.730 UTC INFO:apache_beam.runners.portability.stager:Executing command: ['/usr/bin/python', 'setup.py', 'sdist', '--dist-dir', '/tmp/tmpf1cqqvp4']
2023-07-05 07:53:16.942 UTC warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md
2023-07-05 07:53:16.943 UTC warning: check: missing required meta-data: url
2023-07-05 07:53:17.058 UTC INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
2023-07-05 07:53:17.059 UTC INFO:root:Default Python SDK image for environment is apache/beam_python3.9_sdk:2.47.0
2023-07-05 07:53:17.059 UTC INFO:root:Using provided Python SDK container image: us-central1-docker.pkg.dev/prex-data/dataflow-images/test/:latest
2023-07-05 07:53:17.059 UTC INFO:root:Python SDK container image set to "us-central1-docker.pkg.dev/prex-data/dataflow-images/test/:latest" for Docker environment
2023-07-05 07:53:21.441 UTC WARNING:apache_beam.utils.retry:Retry with exponential backoff: waiting for 4.8348830646277605 seconds before retrying _uncached_gcs_file_copy because we caught exception: FileNotFoundError: [Errno 2] No such file or directory: '/tmp/beam-pipeline-tempq_1isprv/tmp5dwo7v0w'
```

### Issue Priority

Priority: 3 (minor)

### Issue Components

- [X] Component: Python SDK
- [ ] Component: Java SDK
- [ ] Component: Go SDK
- [ ] Component: Typescript SDK
- [X] Component: IO connector
- [ ] Component: Beam examples
- [ ] Component: Beam playground
- [ ] Component: Beam katas
- [ ] Component: Website
- [ ] Component: Spark Runner
- [ ] Component: Flink Runner
- [ ] Component: Samza Runner
- [ ] Component: Twister2 Runner
- [ ] Component: Hazelcast Jet Runner
- [X] Component: Google Cloud Dataflow Runner
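For reference, a minimal sketch of the write step described above. The actual pipeline code was not included in the report, so the table spec, schema, and input elements here are placeholders:

```python
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Create" >> beam.Create([{"id": 1, "name": "example"}])
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:my_dataset.my_table",  # placeholder table spec
            schema="id:INTEGER,name:STRING",   # placeholder schema
            # Removing the line below (so the connector falls back to its
            # default method) makes the pipeline launch successfully;
            # keeping it triggers the FileNotFoundError during staging.
            method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API,
        )
    )
```

Note that `STORAGE_WRITE_API` is implemented as a cross-language transform in the Python SDK, which is why the expansion service mentioned above is involved when this option is set.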
