Does your bucket name have dots (.) in it? Bucket names with dots can cause problems with boto's SSL certificate validation.


On Fri, Jun 10, 2016 at 7:14 AM, Jeremiah Lowin <jlo...@apache.org> wrote:

> Jason,
>
> I will try to figure this out for you tomorrow.
>
> Jeremiah
>
> On Thu, Jun 9, 2016 at 12:20 PM Jason Kromm <jason.kr...@blackboard.com>
> wrote:
>
> > Hi all,
> >
> > I've read the docs, I've looked at the source for S3Hook, and I've
> > brute-forced different combinations in my config file, but I just can't
> > get this working, so I'm hoping someone can give me some insight.
> >
> > I need my logs to go to S3 storage, but no matter how I configure it,
> > my S3 bucket stays empty, and I never see any errors in my airflow
> > scheduler or workers about connection problems, or any sign that a
> > connection was even attempted.
> >
> > pip install airflow[s3]
> >
> > I set up a connection in the web UI called s3_conn, with type S3, and the
> > Extra field set to
> > {"aws_access_key_id": "mykey", "aws_secret_access_key": "mykey"}
> >
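> > (In case it helps reproduce the setup: I believe the same connection can
> > also be created from a Python shell instead of the web UI. The snippet
> > below is just a sketch using the Connection model, with the keys
> > redacted the same way as above.)
> >
> >     # sketch: create the s3_conn connection programmatically
> >     from airflow import settings
> >     from airflow.models import Connection
> >
> >     conn = Connection(conn_id='s3_conn', conn_type='s3')
> >     # same JSON as the Extra field in the UI
> >     conn.extra = '{"aws_access_key_id": "mykey", "aws_secret_access_key": "mykey"}'
> >
> >     session = settings.Session()
> >     session.add(conn)
> >     session.commit()
> >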
> > In the airflow config I set the following:
> > remote_base_log_folder = s3://bucket/  (I've tried with and without
> > the trailing slash)
> > remote_log_conn_id = s3_conn
> > encrypt_s3_logs = False
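> >
> > For reference, here is roughly how that block looks in my airflow.cfg
> > (these settings sit under [core] in my version; the bucket name is a
> > placeholder):
> >
> >     [core]
> >     # where finished task logs should be shipped
> >     remote_base_log_folder = s3://bucket/
> >     # name of the S3 connection defined in the web UI
> >     remote_log_conn_id = s3_conn
> >     encrypt_s3_logs = False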
> >
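> > Also, is there a reasonable way to sanity-check the connection by hand?
> > I was thinking of something roughly like the following from a Python
> > shell, but I'm not sure the hook API matches my Airflow version (the
> > connection argument name might differ), so treat it as a sketch:
> >
> >     # sketch: check that the s3_conn credentials can actually reach the bucket
> >     from airflow.hooks.S3_hook import S3Hook
> >
> >     hook = S3Hook(s3_conn_id='s3_conn')
> >     print(hook.check_for_bucket('bucket'))   # should print True
> >
> >     # try writing a small test object with the same hook the logger uses
> >     hook.load_string('hello', key='airflow-test/hello.txt',
> >                      bucket_name='bucket', replace=True)
> >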
> > Is there some other step that I'm missing?
> >
> > Thanks,
> >
> > Jason Kromm
> >
>
