[
https://issues.apache.org/jira/browse/AIRFLOW-4583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16850130#comment-16850130
]
ASF GitHub Bot commented on AIRFLOW-4583:
-----------------------------------------
MateuszJeziorski commented on pull request #5333: [AIRFLOW-4583] Fix writing
GCP keyfile to tempfile in python3
URL: https://github.com/apache/airflow/pull/5333
### Jira
- [x] My PR addresses the following [Airflow
Jira](https://issues.apache.org/jira/browse/AIRFLOW-4583) issues and references
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
- https://issues.apache.org/jira/browse/AIRFLOW-XXX
- In case you are fixing a typo in the documentation you can prepend your
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
- In case you are proposing a fundamental code change, you need to create
an Airflow Improvement Proposal
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
- In case you are adding a dependency, check if the license complies with
the [ASF 3rd Party License
Policy](https://www.apache.org/legal/resolved.html#category-x).
### Description
- [x] Here are some details about my PR, including screenshots of any UI
changes:
Add support for python3 when saving GCP keyfile to tempfile
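The PR diff is not mirrored here, but the traceback below points at `service_key.write(keyfile_json_str)` failing because `tempfile.NamedTemporaryFile` opens in binary mode by default, so on Python 3 it rejects a `str`. A minimal sketch of the likely shape of the fix (the helper name `write_keyfile_to_tempfile` is illustrative, not Airflow's actual `_set_env_from_extras`):

```python
import json
import tempfile


def write_keyfile_to_tempfile(keyfile_json_str):
    """Write a GCP service-account key to a temp file on Python 3.

    NamedTemporaryFile defaults to binary mode ('w+b'), so writing a str
    raises "TypeError: a bytes-like object is required, not 'str'" on
    Python 3. Encoding the string to bytes first fixes it.
    """
    service_key = tempfile.NamedTemporaryFile(delete=False)
    service_key.write(keyfile_json_str.encode('utf-8'))  # bytes, not str
    service_key.close()
    return service_key.name


# Usage with a dummy key payload
path = write_keyfile_to_tempfile(json.dumps({"type": "service_account"}))
with open(path) as f:
    assert json.load(f)["type"] == "service_account"
```

An equivalent alternative is to open the tempfile in text mode (`tempfile.NamedTemporaryFile(mode='w', ...)`) and keep writing a `str`; either way the mode and the argument type must agree.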
### Tests
- [x] My PR adds the following unit tests __OR__ does not need testing for
this extremely good reason
### Commits
- [x] My commits all reference Jira issues in their subject lines, and I
have squashed multiple commits if they address the same issue. In addition, my
commits follow the guidelines from "[How to write a good git commit
message](http://chris.beams.io/posts/git-commit/)":
1. Subject is separated from body by a blank line
1. Subject is limited to 50 characters (not including Jira issue reference)
1. Subject does not end with a period
1. Subject uses the imperative mood ("add", not "adding")
1. Body wraps at 72 characters
1. Body explains "what" and "why", not "how"
### Documentation
- [x] In case of new functionality, my PR adds documentation that describes
how to use it.
    - All the public functions and the classes in the PR contain docstrings
that explain what they do
    - If you implement backwards incompatible changes, please leave a note in
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so
we can assign it to an appropriate release
### Code Quality
- [x] Passes `flake8`
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
> ERROR - a bytes-like object is required, not 'str' while using gcp_conn_id
> and python3
> --------------------------------------------------------------------------------------
>
> Key: AIRFLOW-4583
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4583
> Project: Apache Airflow
> Issue Type: Bug
> Components: gcp
> Affects Versions: 1.10.1
> Reporter: Mateusz Jeziorski
> Priority: Major
>
> I use an Airflow connection to store a GCP service account key. While using
> {{gcp_conn_id}} in {{GKEPodOperator}} with python3, I get an error in the
> method that tries to write the key to a temp file
> {code}
> [2019-05-28 13:41:02,348] {models.py:1760} ERROR - a bytes-like object is
> required, not 'str'
> Traceback (most recent call last):
> File "/usr/local/lib/airflow/airflow/models.py", line 1659, in _run_raw_task
> result = task_copy.execute(context=context)
> File
> "/usr/local/lib/airflow/airflow/contrib/operators/gcp_container_operator.py",
> line 254, in execute
> key_file = self._set_env_from_extras(extras=extras)
> File
> "/usr/local/lib/airflow/airflow/contrib/operators/gcp_container_operator.py",
> line 303, in _set_env_from_extras
> service_key.write(keyfile_json_str)
> File "/opt/python3.6/lib/python3.6/tempfile.py", line 485, in func_wrapper
> return func(*args, **kwargs)
> TypeError: a bytes-like object is required, not 'str'
> [2019-05-28 13:41:02,361] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op [2019-05-28 13:41:02,348] {models.py:1760} ERROR - a bytes-like object
> is required, not 'str'
> [2019-05-28 13:41:02,363] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op Traceback (most recent call last):
> [2019-05-28 13:41:02,363] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/lib/airflow/airflow/models.py", line 1659, in
> _run_raw_task
> [2019-05-28 13:41:02,363] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op result = task_copy.execute(context=context)
> [2019-05-28 13:41:02,364] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File
> "/usr/local/lib/airflow/airflow/contrib/operators/gcp_container_operator.py",
> line 254, in execute
> [2019-05-28 13:41:02,364] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op key_file = self._set_env_from_extras(extras=extras)
> [2019-05-28 13:41:02,364] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File
> "/usr/local/lib/airflow/airflow/contrib/operators/gcp_container_operator.py",
> line 303, in _set_env_from_extras
> [2019-05-28 13:41:02,365] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op service_key.write(keyfile_json_str)
> [2019-05-28 13:41:02,365] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/opt/python3.6/lib/python3.6/tempfile.py", line 485, in
> func_wrapper
> [2019-05-28 13:41:02,365] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op return func(*args, **kwargs)
> [2019-05-28 13:41:02,366] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op TypeError: a bytes-like object is required, not 'str'
> [2019-05-28 13:41:02,367] {models.py:1791} INFO - Marking task as FAILED.
> [2019-05-28 13:41:02,368] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op [2019-05-28 13:41:02,367] {models.py:1791} INFO - Marking task as
> FAILED.
> [2019-05-28 13:41:02,416] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op Traceback (most recent call last):
> [2019-05-28 13:41:02,417] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/bin/airflow", line 7, in <module>
> [2019-05-28 13:41:02,418] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op exec(compile(f.read(), __file__, 'exec'))
> [2019-05-28 13:41:02,418] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/lib/airflow/airflow/bin/airflow", line 32, in
> <module>
> [2019-05-28 13:41:02,419] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op args.func(args)
> [2019-05-28 13:41:02,420] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/lib/airflow/airflow/utils/cli.py", line 74, in
> wrapper
> [2019-05-28 13:41:02,421] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op return f(*args, **kwargs)
> [2019-05-28 13:41:02,421] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/lib/airflow/airflow/bin/cli.py", line 490, in run
> [2019-05-28 13:41:02,422] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op _run(args, dag, ti)
> [2019-05-28 13:41:02,423] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/lib/airflow/airflow/bin/cli.py", line 406, in _run
> [2019-05-28 13:41:02,423] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op pool=args.pool,
> [2019-05-28 13:41:02,423] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/lib/airflow/airflow/utils/db.py", line 74, in
> wrapper
> [2019-05-28 13:41:02,425] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op return func(*args, **kwargs)
> [2019-05-28 13:41:02,425] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/usr/local/lib/airflow/airflow/models.py", line 1659, in
> _run_raw_task
> [2019-05-28 13:41:02,426] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op result = task_copy.execute(context=context)
> [2019-05-28 13:41:02,426] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File
> "/usr/local/lib/airflow/airflow/contrib/operators/gcp_container_operator.py",
> line 254, in execute
> [2019-05-28 13:41:02,427] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op key_file = self._set_env_from_extras(extras=extras)
> [2019-05-28 13:41:02,428] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File
> "/usr/local/lib/airflow/airflow/contrib/operators/gcp_container_operator.py",
> line 303, in _set_env_from_extras
> [2019-05-28 13:41:02,428] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op service_key.write(keyfile_json_str)
> [2019-05-28 13:41:02,428] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op File "/opt/python3.6/lib/python3.6/tempfile.py", line 485, in
> func_wrapper
> [2019-05-28 13:41:02,430] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op return func(*args, **kwargs)
> [2019-05-28 13:41:02,430] {base_task_runner.py:101} INFO - Job 2068: Subtask
> pod_op TypeError: a bytes-like object is required, not 'str'
> {code}
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)