[
https://issues.apache.org/jira/browse/AIRFLOW-1756?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16219446#comment-16219446
]
Colin Son edited comment on AIRFLOW-1756 at 10/25/17 8:25 PM:
--------------------------------------------------------------
[~ashb]
Sounds good. Can I increase the priority of this ticket? Reading task logs
that have been rotated to S3 is critical for detecting errors, debugging,
etc.
We look forward to seeing this fix in 1.9.0.
> S3 Task Handler Cannot Read Logs With New S3Hook
> ------------------------------------------------
>
> Key: AIRFLOW-1756
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1756
> Project: Apache Airflow
> Issue Type: Bug
> Affects Versions: 1.9.0
> Reporter: Colin Son
> Fix For: 1.9.0
>
>
> With the changes to the S3Hook, it seems it can no longer read the S3 task logs.
> In the `s3_read` method of S3TaskHandler.py:
> {code}
> s3_key = self.hook.get_key(remote_log_location)
> if s3_key:
>     return s3_key.get_contents_as_string().decode()
> {code}
> Since the s3_key object is now a dict, you cannot call
> `get_contents_as_string()` on a dict object. You have to use the S3Hook's
> `read_key()` method to read the contents of the task logs now.
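A minimal sketch of the shape such a fix could take, delegating the read to the hook's `read_key()` rather than calling a method on the returned dict. `StubS3Hook` is a hypothetical stand-in for Airflow's real S3Hook (which does expose `check_for_key()` and `read_key()`), used here only so the sketch is self-contained:

```python
class StubS3Hook:
    """Hypothetical stand-in for airflow's S3Hook, backed by a dict."""

    def __init__(self, store):
        self.store = store

    def check_for_key(self, key):
        # Real S3Hook.check_for_key checks whether the object exists in S3.
        return key in self.store

    def read_key(self, key):
        # Real S3Hook.read_key fetches the object and returns its decoded body.
        return self.store[key]


def s3_read(hook, remote_log_location):
    # Proposed shape of the fix: let the hook read the log contents
    # instead of calling get_contents_as_string() on a dict.
    if hook.check_for_key(remote_log_location):
        return hook.read_key(remote_log_location)
    return ''


hook = StubS3Hook({'s3://bucket/dag/task/1.log': 'task log contents'})
print(s3_read(hook, 's3://bucket/dag/task/1.log'))  # prints "task log contents"
```

This keeps all S3 access behind the hook's public API, so the handler no longer depends on the type of object `get_key()` happens to return.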
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)