[https://issues.apache.org/jira/browse/AIRFLOW-3205?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16649979#comment-16649979]

jack commented on AIRFLOW-3205:
-------------------------------

Some operators, such as MySqlToGoogleCloudStorageOperator, already support this behavior.

You can specify a maximum file size, and you can also give the filename param a 
template such as name{}.json, which will create name0.json, name1.json, and so 
on, producing as many files as needed until all records are exported.
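A minimal sketch of that splitting behavior, assuming the operator serializes rows as newline-delimited JSON and starts a new numbered file once a size threshold is reached (the function name and in-memory return value here are illustrative, not the operator's actual implementation):

```python
import json

def split_into_files(rows, filename_template="name{}.json", max_bytes=1000):
    """Serialize rows as NDJSON, starting a new numbered file whenever the
    current one would exceed max_bytes. Returns {filename: contents}."""
    files = {}
    file_no = 0
    current, size = [], 0
    for row in rows:
        line = json.dumps(row) + "\n"
        # Roll over to the next numbered file before exceeding the limit.
        if current and size + len(line) > max_bytes:
            files[filename_template.format(file_no)] = "".join(current)
            file_no += 1
            current, size = [], 0
        current.append(line)
        size += len(line)
    if current:
        files[filename_template.format(file_no)] = "".join(current)
    return files
```

With 10 small rows and max_bytes=25 this yields name0.json through name4.json, two rows per file.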

[https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/operators/mysql_to_gcs.py]

I haven't checked whether this behavior comes from the operator or from the 
hook, but I agree it would be nice to have it in all operators that interact 
with Google Cloud Storage.

> GCS: Support multipart upload
> -----------------------------
>
>                 Key: AIRFLOW-3205
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3205
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: gcp
>            Reporter: Gordon Ball
>            Priority: Minor
>
> GoogleCloudStorageHook currently only supports uploading files in a single 
> HTTP request. As a result, uploads fail with SSL errors for files larger 
> than 2GiB (presumably an int32 overflow, which might depend on the SSL 
> library in use). Multipart uploads should be supported to allow large 
> uploads, and possibly increase reliability for smaller uploads.
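For context, GCS handles large objects via its resumable upload protocol: the payload is sent in 256 KiB-aligned chunks, each PUT carrying a Content-Range header of the form "bytes start-end/total". A sketch of just the chunking side (the session URL and the actual HTTP calls are omitted; names are illustrative):

```python
CHUNK_UNIT = 256 * 1024  # GCS requires chunk sizes in multiples of 256 KiB

def resumable_chunks(data, chunk_size=CHUNK_UNIT):
    """Yield (chunk, content_range_header) pairs for a resumable upload.

    Every chunk except the last must be a multiple of 256 KiB; the header
    reports the inclusive byte range and the total payload size.
    """
    if chunk_size % CHUNK_UNIT != 0:
        raise ValueError("chunk_size must be a multiple of 256 KiB")
    total = len(data)
    for start in range(0, total, chunk_size):
        chunk = data[start:start + chunk_size]
        end = start + len(chunk) - 1
        yield chunk, "bytes %d-%d/%d" % (start, end, total)
```

In the Python client library, setting `Blob.chunk_size` switches uploads to this chunked resumable mode, which is the usual way around the single-request size limit.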



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
