Updated Branches: refs/heads/trunk 9b84bd031 -> 047452e1c
docs: Add some docs and an example for the S3 driver.

Project: http://git-wip-us.apache.org/repos/asf/libcloud/repo
Commit: http://git-wip-us.apache.org/repos/asf/libcloud/commit/c6df52e4
Tree: http://git-wip-us.apache.org/repos/asf/libcloud/tree/c6df52e4
Diff: http://git-wip-us.apache.org/repos/asf/libcloud/diff/c6df52e4

Branch: refs/heads/trunk
Commit: c6df52e4eef30618eb97428ea55a0e253a1d96a1
Parents: 9b84bd0
Author: Tomaz Muraus <[email protected]>
Authored: Fri Jan 10 19:08:33 2014 +0100
Committer: Tomaz Muraus <[email protected]>
Committed: Fri Jan 10 19:08:33 2014 +0100

----------------------------------------------------------------------
 .../storage/s3/multipart_large_file_upload.py | 19 +++++++++
 docs/storage/drivers/s3.rst                   | 42 ++++++++++++++++++++
 2 files changed, 61 insertions(+)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/libcloud/blob/c6df52e4/docs/examples/storage/s3/multipart_large_file_upload.py
----------------------------------------------------------------------
diff --git a/docs/examples/storage/s3/multipart_large_file_upload.py b/docs/examples/storage/s3/multipart_large_file_upload.py
new file mode 100644
index 0000000..61e4e41
--- /dev/null
+++ b/docs/examples/storage/s3/multipart_large_file_upload.py
@@ -0,0 +1,19 @@
+from libcloud.storage.types import Provider
+from libcloud.storage.providers import get_driver
+
+# Path to a very large file you want to upload
+FILE_PATH = '/home/user/myfile.tar.gz'
+
+cls = get_driver(Provider.S3)
+driver = cls('api key', 'api secret key')
+
+container = driver.get_container(container_name='my-backups-12345')
+
+# This method blocks until all the parts have been uploaded.
+extra = {'content_type': 'application/octet-stream'}
+
+with open(FILE_PATH, 'rb') as iterator:
+    obj = driver.upload_object_via_stream(iterator=iterator,
+                                          container=container,
+                                          object_name='backup.tar.gz',
+                                          extra=extra)

http://git-wip-us.apache.org/repos/asf/libcloud/blob/c6df52e4/docs/storage/drivers/s3.rst
----------------------------------------------------------------------
diff --git a/docs/storage/drivers/s3.rst b/docs/storage/drivers/s3.rst
new file mode 100644
index 0000000..f3213ff
--- /dev/null
+++ b/docs/storage/drivers/s3.rst
@@ -0,0 +1,42 @@
+Amazon S3 Storage Driver Documentation
+======================================
+
+`Amazon Simple Storage Service (Amazon S3)`_ is an online cloud storage service
+from Amazon Web Services.
+
+Multipart uploads
+-----------------
+
+The Amazon S3 driver supports multipart uploads, which means you can upload
+objects with a total size of up to 5 TB.
+
+A multipart upload works by splitting an object into multiple parts and
+uploading those parts to S3. After all the parts of the object have been
+uploaded, Amazon S3 assembles them and creates the final object.
+
+If you use the
+:meth:`libcloud.storage.base.StorageDriver.upload_object_via_stream` method,
+Libcloud transparently handles all the splitting and uploading of the parts
+for you.
+
+By default, to prevent excessive buffering and memory use, each part is
+5 MB in size. This is also the smallest part size you can use with a
+multipart upload.
+
+Examples
+--------
+
+1. Uploading a very large file using the upload_object_via_stream method
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This example shows how you can upload a very large file using the
+``upload_object_via_stream`` method.
+
+Keep in mind that exactly the same approach and method can be used to upload
+files of any size. There is no minimum size limit and you can even upload /
+create empty objects.
+
+.. literalinclude:: /examples/storage/s3/multipart_large_file_upload.py
+   :language: python
+
+.. _`Amazon Simple Storage Service (Amazon S3)`: http://aws.amazon.com/s3/
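As a side note on the 5 MB default part size the new docs mention: that size determines how many parts a given upload is split into. A minimal sketch of the arithmetic (not part of this commit; the `num_parts` helper and the treatment of empty objects are illustrative assumptions, not Libcloud API):

```python
import math

# Default part size described in the docs added by this commit. 5 MB is
# also the smallest part size allowed for any part except the last one.
PART_SIZE = 5 * 1024 * 1024

def num_parts(total_size, part_size=PART_SIZE):
    """Return how many parts a multipart upload of total_size bytes needs."""
    if total_size <= 0:
        # Assume an empty object still takes one (empty) part.
        return 1
    return math.ceil(total_size / part_size)

# A 1 GiB file splits into 204 full 5 MB parts plus a smaller final part.
print(num_parts(1024 ** 3))  # → 205
```

For a 5 TB object this works out to roughly a million 5 MB parts, which is why the driver streams parts one at a time instead of buffering the whole file.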
