I used CloudMan to create a new cluster on Feb 22 and picked 500GB as the 
initial size of the data drive. Working with TCGA exome DNA-seq data, it 
didn't take long to fill that up. I used the CloudMan admin interface to 
resize the volume from 500GB to 1TB, and the resize operation took 15 hours. 
I'm not sure whether that is expected, so I wanted to give a heads-up in 
case it's an area for optimization.

Since I now have a local storage problem (I need to work with more than 1TB 
of data), I tried the route of mounting an S3 bucket using FUSE. I ran into 
a problem where the first s3fs package I tried to install had a version 
conflict with Ubuntu 10.
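For reference, this is roughly what I was attempting — a sketch only, assuming s3fs is built from source (the package names are what I'd expect on Ubuntu 10.x and may differ; the bucket name and mount point are just examples):

```shell
# Build dependencies for s3fs on Ubuntu 10.x (names may vary by release)
sudo apt-get install build-essential libfuse-dev libcurl4-openssl-dev \
    libxml2-dev libssl-dev pkg-config

# ...build and install s3fs from its source tarball, then:

# Credentials file in s3fs's ACCESS_KEY:SECRET_KEY format
echo "ACCESS_KEY:SECRET_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Mount the bucket with a local cache directory for large files
mkdir -p /mnt/s3bucket
s3fs my-bucket /mnt/s3bucket -o passwd_file=~/.passwd-s3fs -o use_cache=/tmp
```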

I remember a mention in a support email that better support for Amazon S3 
was in the works. Can you provide any guidance or thoughts on how to work 
with more than 1TB of data using cost-effective S3 rather than expensive 
EBS? The same question applies to storing results on S3.

With s3fs, the file system can hide many of the complexities of moving files 
back and forth via caching, although working with 30GB+ files still isn't 
going to be fun.
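If the FUSE route doesn't pan out, my understanding is that S3's multipart upload lets you move files of this size in independent, retryable chunks instead of one 30GB transfer. As a sketch of the bookkeeping involved (the `part_ranges` helper and the 64MB part size are my own illustration, not anything from CloudMan or s3fs):

```python
# Hypothetical helper: split a large object into part-sized byte ranges,
# the way S3's multipart upload expects (each part uploaded separately,
# last part may be smaller than the rest).
def part_ranges(total_size, part_size=64 * 1024 * 1024):
    """Return a list of (start, end) byte offsets covering total_size."""
    ranges = []
    start = 0
    while start < total_size:
        end = min(start + part_size, total_size)
        ranges.append((start, end))
        start = end
    return ranges

# A 30GB file split into 64MB parts:
parts = part_ranges(30 * 1024**3)
print(len(parts))  # 480 parts, each uploadable/retryable on its own
```

Each range could then be read with a seek and fed to a separate upload call, so a failed chunk costs 64MB of retransfer rather than 30GB.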

Thanks

Scooter


___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/
