[jira] [Commented] (AIRFLOW-1797) Cannot write task logs to S3 with Python3

2017-11-09 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16246385#comment-16246385
 ] 

ASF subversion and git services commented on AIRFLOW-1797:
--

Commit 28411b1e7eddb3338a329db3e52ee09de3676784 in incubator-airflow's branch 
refs/heads/master from [~ashb]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=28411b1 ]

[AIRFLOW-1797] S3Hook.load_string didn't work on Python3

With the switch to Boto3 we now need the content to be bytes, not a string. On Python2 there is no difference, but for Python3 this matters.

And since there were no real tests covering the S3Hook I've added some basic ones.

Closes #2771 from ashb/AIRFLOW-1797
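
In practice the fix comes down to encoding the log string before it reaches boto3. A rough standalone sketch of the idea, not the actual S3Hook patch (the client construction and the ServerSideEncryption value here are illustrative assumptions):

{noformat}
# Sketch: uploading a str to S3 with boto3 on Python 3. The buffer handed to
# upload_fileobj must yield bytes, otherwise botocore's MD5 step fails.
import io

import boto3


def load_string(string_data, key, bucket_name, encrypt=False):
    """Encode a str to UTF-8 bytes, then upload it to S3."""
    client = boto3.client('s3')
    extra_args = {'ServerSideEncryption': 'AES256'} if encrypt else {}

    # On Python 2 str is already bytes; on Python 3 a str-backed buffer ends
    # in "TypeError: Unicode-objects must be encoded before hashing".
    filelike_buffer = io.BytesIO(string_data.encode('utf-8'))

    client.upload_fileobj(filelike_buffer, bucket_name, key,
                          ExtraArgs=extra_args)
{noformat}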


> Cannot write task logs to S3 with Python3
> -
>
> Key: AIRFLOW-1797
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1797
> Project: Apache Airflow
> Issue Type: Bug
> Reporter: Ash Berlin-Taylor
>
> {noformat}
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.5/dist-packages/airflow/utils/log/s3_task_handler.py", line 161, in s3_write
>     encrypt=configuration.getboolean('core', 'ENCRYPT_S3_LOGS'),
>   File "/usr/local/lib/python3.5/dist-packages/airflow/hooks/S3_hook.py", line 253, in load_string
>     client.upload_fileobj(filelike_buffer, bucket_name, key, ExtraArgs=extra_args)
>   File "/usr/local/lib/python3.5/dist-packages/boto3/s3/inject.py", line 431, in upload_fileobj
>     return future.result()
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/futures.py", line 73, in result
>     return self._coordinator.result()
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/futures.py", line 233, in result
>     raise self._exception
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/tasks.py", line 126, in __call__
>     return self._execute_main(kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/tasks.py", line 150, in _execute_main
>     return_value = self._main(**kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/upload.py", line 679, in _main
>     client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 312, in _api_call
>     return self._make_api_call(operation_name, kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 586, in _make_api_call
>     request_signer=self._request_signer, context=request_context)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 242, in emit_until_response
>     responses = self._emit(event_name, kwargs, stop_on_response=True)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 210, in _emit
>     response = handler(**kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 201, in conditionally_calculate_md5
>     calculate_md5(params, **kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 179, in calculate_md5
>     binary_md5 = _calculate_md5_from_file(body)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 193, in _calculate_md5_from_file
>     md5.update(chunk)
> TypeError: Unicode-objects must be encoded before hashing
> {noformat}
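
The final frame reproduces on its own: hashlib on Python 3 refuses str input, which is exactly what botocore's MD5 calculation hits when the upload buffer yields unicode chunks. A minimal check, assuming plain CPython 3 and nothing Airflow-specific:

{noformat}
import hashlib

md5 = hashlib.md5()
try:
    md5.update('*** task log contents ***')        # str: TypeError on Python 3
except TypeError as exc:
    print(exc)   # e.g. "Unicode-objects must be encoded before hashing" on 3.5

md5.update('*** task log contents ***'.encode('utf-8'))  # bytes: works
print(md5.hexdigest())
{noformat}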



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (AIRFLOW-1797) Cannot write task logs to S3 with Python3

2017-11-09 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16246391#comment-16246391
 ] 

ASF subversion and git services commented on AIRFLOW-1797:
--

Commit 6b7c17d17b664c74d507dc006eb12cd023feb837 in incubator-airflow's branch 
refs/heads/v1-9-stable from [~ashb]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=6b7c17d ]

[AIRFLOW-1797] S3Hook.load_string didn't work on Python3

With the switch to Boto3 we now need the content to be bytes, not a string. On Python2 there is no difference, but for Python3 this matters.

And since there were no real tests covering the S3Hook I've added some basic ones.

Closes #2771 from ashb/AIRFLOW-1797

(cherry picked from commit 28411b1e7eddb3338a329db3e52ee09de3676784)
Signed-off-by: Bolke de Bruin 


> Cannot write task logs to S3 with Python3
> -
>
> Key: AIRFLOW-1797
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1797
> Project: Apache Airflow
> Issue Type: Bug
> Reporter: Ash Berlin-Taylor
> Fix For: 1.9.1
>
>
> {noformat}
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.5/dist-packages/airflow/utils/log/s3_task_handler.py", line 161, in s3_write
>     encrypt=configuration.getboolean('core', 'ENCRYPT_S3_LOGS'),
>   File "/usr/local/lib/python3.5/dist-packages/airflow/hooks/S3_hook.py", line 253, in load_string
>     client.upload_fileobj(filelike_buffer, bucket_name, key, ExtraArgs=extra_args)
>   File "/usr/local/lib/python3.5/dist-packages/boto3/s3/inject.py", line 431, in upload_fileobj
>     return future.result()
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/futures.py", line 73, in result
>     return self._coordinator.result()
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/futures.py", line 233, in result
>     raise self._exception
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/tasks.py", line 126, in __call__
>     return self._execute_main(kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/tasks.py", line 150, in _execute_main
>     return_value = self._main(**kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/upload.py", line 679, in _main
>     client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 312, in _api_call
>     return self._make_api_call(operation_name, kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 586, in _make_api_call
>     request_signer=self._request_signer, context=request_context)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 242, in emit_until_response
>     responses = self._emit(event_name, kwargs, stop_on_response=True)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 210, in _emit
>     response = handler(**kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 201, in conditionally_calculate_md5
>     calculate_md5(params, **kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 179, in calculate_md5
>     binary_md5 = _calculate_md5_from_file(body)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 193, in _calculate_md5_from_file
>     md5.update(chunk)
> TypeError: Unicode-objects must be encoded before hashing
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (AIRFLOW-1797) Cannot write task logs to S3 with Python3

2017-11-09 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16246388#comment-16246388
 ] 

ASF subversion and git services commented on AIRFLOW-1797:
--

Commit d592f891e58650472c8fba89bace3cce54a7972b in incubator-airflow's branch 
refs/heads/v1-9-test from [~ashb]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=d592f89 ]

[AIRFLOW-1797] S3Hook.load_string didn't work on Python3

With the switch to Boto3 we now need the content to be bytes, not a string. On Python2 there is no difference, but for Python3 this matters.

And since there were no real tests covering the S3Hook I've added some basic ones.

Closes #2771 from ashb/AIRFLOW-1797

(cherry picked from commit 28411b1e7eddb3338a329db3e52ee09de3676784)
Signed-off-by: Bolke de Bruin 
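
The "basic ones" mentioned above can stay very small when run against a fake S3 backend. A sketch of the round-trip idea (using moto to fake S3 in-process is an assumption here, not necessarily what the Airflow test suite does):

{noformat}
import boto3
from moto import mock_s3


@mock_s3
def test_string_round_trip():
    # moto intercepts boto3 calls, so no credentials or network are needed.
    conn = boto3.resource('s3', region_name='us-east-1')
    conn.create_bucket(Bucket='airflow-logs')

    # Stand-in for S3Hook.load_string: encode to bytes, upload, read back.
    body = '*** Task log contents ***'
    obj = conn.Object('airflow-logs', 'dag_id/task_id/2017-11-09')
    obj.put(Body=body.encode('utf-8'))

    assert obj.get()['Body'].read().decode('utf-8') == body
{noformat}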


> Cannot write task logs to S3 with Python3
> -
>
> Key: AIRFLOW-1797
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1797
> Project: Apache Airflow
> Issue Type: Bug
> Reporter: Ash Berlin-Taylor
>
> {noformat}
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.5/dist-packages/airflow/utils/log/s3_task_handler.py", line 161, in s3_write
>     encrypt=configuration.getboolean('core', 'ENCRYPT_S3_LOGS'),
>   File "/usr/local/lib/python3.5/dist-packages/airflow/hooks/S3_hook.py", line 253, in load_string
>     client.upload_fileobj(filelike_buffer, bucket_name, key, ExtraArgs=extra_args)
>   File "/usr/local/lib/python3.5/dist-packages/boto3/s3/inject.py", line 431, in upload_fileobj
>     return future.result()
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/futures.py", line 73, in result
>     return self._coordinator.result()
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/futures.py", line 233, in result
>     raise self._exception
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/tasks.py", line 126, in __call__
>     return self._execute_main(kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/tasks.py", line 150, in _execute_main
>     return_value = self._main(**kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/s3transfer/upload.py", line 679, in _main
>     client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 312, in _api_call
>     return self._make_api_call(operation_name, kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 586, in _make_api_call
>     request_signer=self._request_signer, context=request_context)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 242, in emit_until_response
>     responses = self._emit(event_name, kwargs, stop_on_response=True)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 210, in _emit
>     response = handler(**kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 201, in conditionally_calculate_md5
>     calculate_md5(params, **kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 179, in calculate_md5
>     binary_md5 = _calculate_md5_from_file(body)
>   File "/usr/local/lib/python3.5/dist-packages/botocore/handlers.py", line 193, in _calculate_md5_from_file
>     md5.update(chunk)
> TypeError: Unicode-objects must be encoded before hashing
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)