[ https://issues.apache.org/jira/browse/BEAM-12958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17420024#comment-17420024 ]

Fernando Morales commented on BEAM-12958:
-----------------------------------------

Hi [~altay]!

As [~kileys] mentioned, the PR was reverted yesterday, a couple of hours after it was merged.

> All "Wordcount Dataflow" GHA checks failing with PERMISSION_DENIED
> ------------------------------------------------------------------
>
>                 Key: BEAM-12958
>                 URL: https://issues.apache.org/jira/browse/BEAM-12958
>             Project: Beam
>          Issue Type: Bug
>          Components: test-failures
>            Reporter: Brian Hulette
>            Assignee: Valentyn Tymofieiev
>            Priority: P1
>
> "Wordcount Dataflow" checks for 
> [Java|https://github.com/apache/beam/actions/runs/1268237569] and 
> [Python|https://github.com/apache/beam/runs/3695202750?check_suite_focus=true]
>  have been failing for some time.
> Error from Python:
> {code}
> 4s
> Run python -m apache_beam.examples.wordcount \
> INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
> INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> INFO:oauth2client.client:Refreshing access_token
> INFO:apache_beam.runners.portability.stager:Copying Beam SDK "../../apache-beam-source/apache-beam-source.tar.gz" to staging location.
> WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
> INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
> INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36:beam-master-20210809
> INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36:beam-master-20210809" for Docker environment
> INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fd41a43ad08> ====================
> INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fd41a438510> ====================
> INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://***/tmp/python_wordcount_dataflow/
> INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://***/tmp/python_wordcount_dataflow/beamapp-runner-0924021925-749470.1632449965.749799/pickled_main_session...
> INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://***/tmp/python_wordcount_dataflow/beamapp-runner-0924021925-749470.1632449965.749799/pickled_main_session in 0 seconds.
> INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://***/tmp/python_wordcount_dataflow/beamapp-runner-0924021925-749470.1632449965.749799/dataflow_python_sdk.tar...
> INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://***/tmp/python_wordcount_dataflow/beamapp-runner-0924021925-749470.1632449965.749799/dataflow_python_sdk.tar in 0 seconds.
> INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://***/tmp/python_wordcount_dataflow/beamapp-runner-0924021925-749470.1632449965.749799/pipeline.pb...
> INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://***/tmp/python_wordcount_dataflow/beamapp-runner-0924021925-749470.1632449965.749799/pipeline.pb in 0 seconds.
> Traceback (most recent call last):
>   File "/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/runpy.py", line 193, in _run_module_as_main
>     "__main__", mod_spec)
>   File "/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/runpy.py", line 85, in _run_code
>     exec(code, run_globals)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/examples/wordcount.py", line 94, in <module>
>     run()
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/examples/wordcount.py", line 89, in run
>     output | 'Write' >> WriteToText(known_args.output)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/pipeline.py", line 590, in __exit__
>     self.result = self.run()
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/pipeline.py", line 543, in run
>     self._options).run(False)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/pipeline.py", line 567, in run
>     return self.runner.run_pipeline(self, self._options)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 595, in run_pipeline
>     self.dataflow_client.create_job(self.job), self)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/utils/retry.py", line 253, in wrapper
>     return fun(*args, **kwargs)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 695, in create_job
>     return self.submit_job_description(job)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/utils/retry.py", line 253, in wrapper
>     return fun(*args, **kwargs)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 796, in submit_job_description
>     response = self._client.projects_locations_jobs.Create(request)
>   File "/home/runner/work/beam/beam/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py", line 903, in Create
>     config, request, global_params=global_params)
>   File "/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
>     return self.ProcessHttpResponse(method_config, http_response, request)
>   File "/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
>     self.__ProcessHttpResponse(method_config, http_response, request))
>   File "/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
>     http_response, method_config=method_config, request=request)
> apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/***/locations/***/jobs?alt=json>: response: <***'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 24 Sep 2021 02:19:26 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'alt-svc': 'h3=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-T051=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '482', '-content-encoding': 'gzip'***>, content <***
>   "error": ***
>     "code": 403,
>     "message": "(8264fa821ebdd451): Current user cannot act as service account 844138762903-comp...@developer.gserviceaccount.com. Enforced by Org Policy constraint constraints/dataflow.enforceComputeDefaultServiceAccountCheck. https://cloud.google.com/iam/docs/service-accounts-actas Causes: (8264fa821ebddfa9): Current user cannot act as service account 844138762903-comp...@developer.gserviceaccount.com.",
>     "status": "PERMISSION_DENIED"
>   ***
> ***
> >
> {code}
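
For context on the error above: the 403 is the Dataflow service refusing to create the job because the CI credentials are not allowed to act as the Compute Engine default service account, which the constraints/dataflow.enforceComputeDefaultServiceAccountCheck Org Policy now enforces. As a minimal sketch (not necessarily the fix applied here), a job can be pointed at a dedicated controller service account via the --service_account_email pipeline option; every project, bucket, and service-account name below is a placeholder, not a value from the Beam CI setup:

{code}
# Illustrative sketch only: run the wordcount example on Dataflow while acting
# as an explicitly named controller service account instead of the Compute
# Engine default one. All resource names here are placeholders.
from apache_beam.examples import wordcount

wordcount.run([
    '--runner=DataflowRunner',
    '--project=my-gcp-project',                      # placeholder project
    '--region=us-central1',                          # placeholder region
    '--temp_location=gs://my-bucket/tmp/wordcount',  # placeholder bucket
    '--service_account_email=beam-ci-worker@my-gcp-project.iam.gserviceaccount.com',  # placeholder SA
    '--output=gs://my-bucket/wordcount/output',      # placeholder output prefix
])
{code}

Whichever account is passed there, the identity submitting the job still needs roles/iam.serviceAccountUser (the "actAs" permission from the linked docs) on that service account.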



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
