liad <[email protected]> added the comment:
The gzip module may work for saving a file locally, but consider, for example, this snippet that uploads JSON to Google Cloud Storage:
import json
import datalab.storage as storage

storage.Bucket('mybucket').item(path).write_to(json.dumps(response),
                                               'application/json')
That won't work here unless I save the file locally and only then upload it...
which is a real problem when your files are 100 GB or more.
I still think json.dump() should support compression.
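For what it's worth, here is a minimal sketch of gzip-compressing the serialized JSON entirely in memory with the stdlib, without touching the local filesystem; the payload and the commented-out upload call are hypothetical stand-ins for the `response` and bucket in the snippet above:

import gzip
import json

# Hypothetical payload; in the snippet above this is `response`.
response = {"status": "ok", "values": [1, 2, 3]}

# Compress the serialized JSON in memory -- no temporary file needed.
compressed = gzip.compress(json.dumps(response).encode("utf-8"))

# The resulting bytes could then be handed to an upload API, e.g.
# (hypothetical, mirroring the call above):
# storage.Bucket('mybucket').item(path).write_to(compressed, 'application/json')

# Round-trip check: decompressing and parsing recovers the original object.
assert json.loads(gzip.decompress(compressed).decode("utf-8")) == response

This still buffers the whole payload in memory, so for 100 GB+ files one would want a streaming approach (e.g. wrapping a file-like upload stream in gzip.GzipFile), which is exactly where built-in compression support in json.dump() would help.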
----------
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue34393>
_______________________________________