[ https://issues.apache.org/jira/browse/AIRFLOW-115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16196979#comment-16196979 ]
Michael Crawford commented on AIRFLOW-115:
------------------------------------------
Hmm, yes, it seems they have made the S3 hook more like the AWS ones: it uses
the connection's login/password or falls back to native boto3 credentials.
I personally prefer this method, although if we are going to make the key and
password first-class citizens in the connection, why not make the region one
as well?
We should also change the display labels so people know they can add them
there.
To keep this somewhat backward compatible, I think we should let it fall back
to inspecting the extra field.
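Something like the following is what I have in mind. This is a minimal
sketch, assuming an Airflow-style connection object with login, password and
extra attributes; the extra-field keys and helper names are illustrative
assumptions, not the actual hook code:
{code:python}
import json

import boto3


def resolve_credentials(conn):
    """Resolve AWS credentials from an Airflow connection.

    Order: first-class login/password, then the legacy extra field,
    then None/None so boto3 falls back to its native credential chain
    (env vars, ~/.aws/credentials, instance profile, ...).
    """
    # 1. New behaviour: key and secret as first-class connection fields.
    if conn.login and conn.password:
        return conn.login, conn.password

    # 2. Backward compatibility: inspect the extra field like the old hook.
    extra = json.loads(conn.extra or "{}")
    if "aws_access_key_id" in extra:
        return extra["aws_access_key_id"], extra.get("aws_secret_access_key")

    # 3. Nothing on the connection: let boto3 resolve credentials natively.
    return None, None


def client_for(conn, service="s3"):
    access_key, secret_key = resolve_credentials(conn)
    # If region became a first-class field it would be read the same way;
    # here it is pulled from extra under an assumed "region_name" key.
    region = json.loads(conn.extra or "{}").get("region_name")
    return boto3.client(
        service,
        aws_access_key_id=access_key,        # None -> boto3 native lookup
        aws_secret_access_key=secret_key,
        region_name=region,
    )
{code}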
> Migrate and Refactor AWS integration to use boto3 and better structured hooks
> -----------------------------------------------------------------------------
>
> Key: AIRFLOW-115
> URL: https://issues.apache.org/jira/browse/AIRFLOW-115
> Project: Apache Airflow
> Issue Type: Improvement
> Components: aws, boto3, hooks
> Reporter: Arthur Wiedmer
> Assignee: Arthur Wiedmer
> Priority: Minor
>
> h2. Current State
> The current AWS integration is mostly done through the S3Hook, which uses
> non-standard credential parsing and relies on boto rather than boto3, the
> currently supported AWS SDK for Python.
> h2. Proposal
> An AWSHook should be provided that maps Airflow connections to the boto3
> API. Hooks for S3, as well as for other AWS services, would then inherit
> from this base hook and extend it with service-specific methods such as
> get_key for S3, start_cluster for EMR, enqueue for SQS, send_email for
> SES, etc.:
> * AWSHook
> ** S3Hook
> ** EMRHook
> ** SQSHook
> ** SESHook
> ...
>
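For concreteness, here is a rough sketch of the hierarchy the quoted
description proposes. The method bodies, the get_client helper and the hook
names are illustrative assumptions, not the final API:
{code:python}
import boto3


class AwsHook:
    """Base hook: maps an Airflow connection onto the boto3 API."""

    def __init__(self, aws_conn_id="aws_default"):
        self.aws_conn_id = aws_conn_id

    def get_client(self, service):
        # Shared credential/region resolution (connection fields, extra
        # field, or boto3's native chain) would live here.
        return boto3.client(service)


class S3Hook(AwsHook):
    def get_key(self, key, bucket_name):
        return self.get_client("s3").get_object(Bucket=bucket_name, Key=key)


class EmrHook(AwsHook):
    def start_cluster(self, **job_flow):
        return self.get_client("emr").run_job_flow(**job_flow)


class SqsHook(AwsHook):
    def enqueue(self, queue_url, message):
        return self.get_client("sqs").send_message(
            QueueUrl=queue_url, MessageBody=message)


class SesHook(AwsHook):
    def send_email(self, **message):
        return self.get_client("ses").send_email(**message)
{code}
Keeping credential resolution in the base class means every service hook
inherits the same connection handling, and adding a new service is just
another subclass.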